Looking Back, Looking Ahead
Today's Objectives:
- Appreciate our journey together
- Deploy backend code
- Get exposure to many platforms
- Take a look at the landscape
Our course together
Thank you all for making this course possible. It was experimental, and I know I would do some things differently if I could do it again. But I feel that each of you has grown through your efforts, and I'm proud to have helped that growth. Here is a quick look at the super-powers I hope you now have:
- The ability to wield SQL to your will.
- The ability to see databases through the lens of relational calculus.
- The ability to connect PHP, Python, and Node applications to SQLite, MySQL, and MongoDB databases.
- The ability to use an ORM to keep your business logic modular and your database clean.
- The ability to use a view to synthesize a particular perspective on your data.
- The ability to design a data architecture for the needs of an application.
- The ability to create (and use) HTTP-driven REST APIs.
- The ability to use *nix command line shells to create things quickly.
- The ability to write programs which extract data from almost any text source.
- The ability to use transactions to protect your data from corruption.
- The ability to recover a corrupted database.
- The ability to create an ERD for communicating the important relationships in your data.
- The ability to create custom SQL functions.
- The ability to use keys, functional dependencies, triggers, and normalization to create error-resistant tables.
- The ability to analyze queries and judiciously create indices to improve query speed.
- The ability to create backendless, data-driven applications through Firebase.
- The ability to creatively join your tables in SQL.
- The ability to populate relationships in your denormalized NoSQL schema.
- The ability to make early mistakes and use rapid prototyping for faster success.
- The ability to push yourself through documentation, tutorials, Google, Stack Overflow, and hurdles to create a meaningful product.
If you have other super-powers to add to the list, let me know. If you have feedback for what you enjoyed, disliked, would improve, or are grateful for in this class, then PLEASE leave a course evaluation.
Backend Deployment
We have used Cloud9, your local machines, and the ECE servers to do our work so far. So what do you do when you want to take your work public?
If you are interacting with the database in an application, like a mobile app or research project, then maybe you don't need a server. But most of our work has had the web in mind. To serve your API to the world you'll need a server and a domain name. For most of you I would recommend a shared host like Web Faction; this has the benefit of good technical support, not having to worry about DDoS attacks or vulnerabilities, and all of the tools you've grown accustomed to. A shared host will usually cost you $5-10 per month and a domain name $5-20 per year.
Since I don't want to make you pay for something in this class, we'll use a somewhat popular alternative built from free components that we understand: MongoLab to create a free MongoDB backend (which we could scale as needed), and Heroku to serve our Node application (again scalable as needed). This gives us a public URL, backend storage, and the ability to scale if our little start-up takes off on us.
Task: deploy a Mongo app to the world using Heroku.
- Go to MongoLab and create an account.
- Start a new MongoDB deployment (use the free tier backed by Amazon Web Services).
- Click the database in the browser.
- Create a new user and password.
- Note the connection data on that screen. Copy the mongo command.
- Fork my quick-stack GitHub repo, then begin a Cloud9 workspace from it.
- Run npm install to fetch the app's dependencies.
- Connect to your MongoLab database from the command line on Cloud9 with a command like this:
mongo ds031581.mongolab.com:31581/july28mongo -u user -p password
- Insert a test record with:
db.characters.insert({name: "Scout Finch"});
- Exit Mongo
- On line 2 of server.js, have Mongoose connect to your MongoLab database using the earlier URI. It should look something like: mongoose.connect('mongodb://user:password@ds031581.mongolab.com:31581/july28mongo');
- From your command line, run the app with npm start.
- Preview the running app, and confirm that you see Scout on screen.
- You have now created a full-stack app (a BackboneJS app, by the way) using MongoLab as your backend.
- Commit your progress:
git commit -am 'pointed to mongolab db'
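The URI that mongoose.connect expects is just the pieces from the MongoLab screen glued together. As a sketch (the helper name and example values are mine, not part of the quick-stack repo), the format is:

```javascript
// Assemble a MongoDB connection URI from the pieces MongoLab shows you.
// The host, port, and database name below are placeholders -- substitute
// the values from your own deployment's screen.
function buildMongoUri(user, password, host, port, db) {
  return 'mongodb://' + user + ':' + password + '@' + host + ':' + port + '/' + db;
}

var uri = buildMongoUri('user', 'password', 'ds031581.mongolab.com', 31581, 'july28mongo');
console.log(uri);
// mongodb://user:password@ds031581.mongolab.com:31581/july28mongo
```

Passing a string in that shape to mongoose.connect is all the "pointing at MongoLab" step really does.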
So far this isn't too unusual for us. I'm using Cloud9 as the starting point, but you could have run this on your local machine if that is more comfortable. We essentially just confirmed that the code works and can reach MongoLab. Next we want to push the entire app to the cloud using Heroku:
- Create a Heroku account.
- From your Cloud9 command line, install the Heroku Toolbelt:
wget -O- https://toolbelt.heroku.com/install-ubuntu.sh | sh
heroku login
heroku create
git push heroku master
heroku ps:scale web=1
- Look through the output of your git push command to find the web address of your app.
- Visit the app, create some characters by clicking the button, then refresh the page and see that they are still there.
- Running
heroku run bash
will open a terminal into your "application", and you can then poke around.
- Exit the Heroku terminal (type exit or press Ctrl-D).
- Login to the mongo database again and see the new characters.
- If you want an SQL database in your app you could instead do:
heroku addons:create heroku-postgresql:hobby-dev
- This creates a PostgreSQL database, which you can connect to with:
heroku pg:psql
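For reference, Heroku decides how to start your app by reading a Procfile in the repo root; the web process it launches there is the same one that heroku ps:scale web=1 scales. The quick-stack repo should already include one, but a minimal Procfile for a Node app is just one line:

```
web: npm start
```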
Extra task: As a challenge, can you deploy your 'social network' to Heroku?
By the way, you now sort of know Postgres out of the box; it is a community-driven alternative to MySQL that is gaining steam (I think partly because it supports regular expressions in its queries).
Alternatives
As a parting gift, here is a list of other ways to deploy backend code.
- Amazon Web Services (EC2, Elastic Beanstalk, S3, Glacier, Lambda, API Gateway). There are some AWESOME things you can do at cloud scale, starting for free, using Amazon. If you are interested, then set aside a week and make some clear-cut goals. They charge for tech assistance, and getting your hello world going is sometimes a bit hairy. You can use them as a server, for data, for long-term storage of data, for Node apps, for web workers, etc. Have fun.
- Pagoda Box is the PHP version of Heroku.
- OpenShift is the Red Hat version, and it feels a bit more like a proper server.
- Digital Ocean is a sort of shared hosting where you can spin up your own machines as needed and pay by the hour. (The GitHub Education pack includes free Digital Ocean hosting.)
- Linode is in the same arena: you can create machine after machine, all with various OSes and connected to each other. You get a 24-hour trial period, so plan wisely if you want to try it for free. This would be a good way to practice sharding across multiple SQL databases.
- Web Faction is my favorite shared-host platform for developers. This is the sort of place you would run your personal website.
- A Small Orange is another contender in that area, a little cheaper, but less featured for developers.
Databases in your career
In this course we touched on three of the most in-demand programming languages of 2015. This matters because each of us produces a ton of data every day, and in our interconnected world being able to manage that data and make sense of it is a very important skill.
I wanted to give you some buzzwords that I thought would be just beyond the scope of this class, but that you should spend some time (in August) learning about. Some big-data tools to play with:
- Hadoop/Hive: For large-scale data, Hadoop is the filesystem of choice, and Hive (or MapReduce) is the way to query that data. To work at petabyte scale you'll need a technology like this under your belt.
- Cassandra: A popular alternative to MongoDB; it outperforms Mongo in many tasks but isn't quite as general.
- Presto: Probably the next big-data contender; many companies have already switched.
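To make the MapReduce bullet concrete, here is the pattern in miniature, in plain JavaScript (a toy sketch of the idea, not Hadoop's actual API): map every record to key/value pairs, group by key, then reduce each group. Hadoop's trick is running those same two phases across thousands of machines.

```javascript
// The MapReduce pattern in miniature: map each record to (key, value)
// pairs, group the values by key, then reduce each group to one result.
function mapReduce(records, mapFn, reduceFn) {
  var groups = {};
  for (var i = 0; i < records.length; i++) {
    var pairs = mapFn(records[i]);
    for (var j = 0; j < pairs.length; j++) {
      var key = pairs[j][0], value = pairs[j][1];
      (groups[key] = groups[key] || []).push(value);
    }
  }
  var result = {};
  for (var k in groups) {
    result[k] = reduceFn(k, groups[k]);
  }
  return result;
}

// Classic example: counting words across a set of documents.
var docs = ['to kill a mockingbird', 'a time to kill'];
var counts = mapReduce(
  docs,
  function (doc) { return doc.split(' ').map(function (w) { return [w, 1]; }); },
  function (word, ones) { return ones.reduce(function (a, b) { return a + b; }, 0); }
);
console.log(counts); // { to: 2, kill: 2, a: 2, mockingbird: 1, time: 1 }
```

Hive essentially compiles SQL-like queries down to jobs shaped like this.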
Now what are people doing with this data? Looking for patterns. A good thing to practice doing with your data is creating:
- Bayesian Networks: basically used to help internet companies decide how to route their users. The upshot is this: if someone is at a certain stage in our site/process, what are the odds of making a sale, and do they go up or down if their next step is X or Y?
- Markov Chains, the denser version of Bayesian networks, help analyze the flow from state to state of a system.
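The state-to-state flow idea can be sketched in a few lines of JavaScript (the pages and probabilities here are made up for illustration): each state stores the probability of moving to each next state, and the probability of a whole path is the product of its transitions.

```javascript
// Toy Markov chain for a website: transitions[a][b] is the probability
// that a user currently on page a moves next to page b.
var transitions = {
  home:   { browse: 0.7, leave: 0.3 },
  browse: { sale: 0.2, browse: 0.5, leave: 0.3 }
};

// The probability of one specific path through the chain is the
// product of the transition probabilities along it.
function pathProbability(path) {
  var p = 1;
  for (var i = 0; i < path.length - 1; i++) {
    var row = transitions[path[i]] || {};
    p *= row[path[i + 1]] || 0;
  }
  return p;
}

// Odds a visitor goes home -> browse -> sale: 0.7 * 0.2
console.log(pathProbability(['home', 'browse', 'sale'])); // ~0.14
```

Comparing such path probabilities is exactly the "does a sale become more or less likely if the next step is X or Y" question from the Bayesian-network bullet.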
Other databases that rule the world, which you should try some comparisons with:
- PostgreSQL
- Microsoft SQL Server
- CouchDB
- HBase
- Redis
- Oracle Database
- IBM DB2
Any Questions?
Write me an email anytime and I'll try my best to respond and be of any help I can be. Good luck out there!