Parse in 2017 — remembering the forgotten, using PostgreSQL, and some benchmarks
Parse.com (doesn’t exist anymore) created a storm in every frontend developer’s teacup by announcing a shutdown.
The announcement dropped on us out of nowhere, and #parseshutdown was trending on Twitter for a week.
So why are we discussing Parse, six months after its demise? Because it is not dead. At least, the 'idea of Parse' is not. Read on...
The beginning of the mBaaS wars
Parse is a story of selling your tech company to someone who didn't quite buy it for its real value, but just because they weren't sure where else to throw money. Parse started off as a BaaS — Backend as a Service — a kind of service which gained immense popularity in the 2012–2014 era, when Amazon and Google hadn't yet diversified their cloud offerings to the 'functions as a service' and 'image recognition as a service' level. AWS and GCE were still enterprise. Smaller teams preferred DigitalOcean. But what about indie developers with only frontend skills? Especially iOS or Android developers (because with the rise of NodeJS, all web frontend developers are de-facto backend developers too). That's where Parse came in. And an entire horde followed: Backendless, Shephertz, Backand, Kinvey... an entire portfolio of the 2015 app bubble. The hallmark used to be way-too-easy-to-use iOS + Android + WinPhone + JS libraries, which would magically do analytics, database, ACL, social login and SSO, push notifications and nominal server-side business logic for you.
Among all of them, Parse worked out really well. And it was snapped up by Facebook in their mid-2010s shopping spree (which also included WhatsApp, Wit.ai and a laundry list of others). Facebook didn't know at that time that they'd end up being an ad company like Google. They figured they needed to hold their fort on DevOps, as Microsoft (Azure), Amazon (AWS), Google (GCE) and Twitter (Fabric) all had that covered. For the first year, that was a much needed boost to Parse. It went from strength to strength, and a significant number of "app startups" were building entirely front-end-only teams, running on top of Parse. Google snapped up Firebase, and the debate of 2015 was 'Firebase vs Parse'. (Both had databases, although admittedly very differently structured, and both had the other bling features like social login and push notifications.) Parse, obviously, was upselling Facebook login, and Firebase Google login.
Fast-forward to today, and things are way different. Facebook realised spreading fake news and earning ad dollars is their thing. They announced in 2016 that Parse would be shut down (effective Jan 2017). This created the biggest blip for anyone running sentiment analysis on the 'developers' crowd on Twitter, for sure. #parseshutdown was trending for a week. Million-dollar startups suddenly realised 'building a backend' is also a thing, and that most people build their own. On the other end of the valley, Twitter has realised the same — and has handed over Fabric to Google, who now have an mBaaS juggernaut in the form of Firebase (read: http://blog.championswimmer.in/2016/10/don-t-jump-onto-the-firebase-bandwagon-yet ). AWS, Azure and GCE all now have mobile backend microservices (Push, Auth, ACL, Geocoding, On-Demand function execution), and more importantly, they are easy to use even for indie devs and small teams.
Parse is dead, long live Parse
Everyone is entitled to their opinion, but I think the Parse shutdown was the best thing to happen to Parse. As they say — what doesn't kill you makes you stronger, and the Parse shutdown did not kill Parse. In the software world, the way most software survives a corporate backer turning away is by going open source and turning to the community. And Parse has a huge community — all those startups deeply invested in the Parse way of doing things. And thus was born the open source Parse Server.
Do note that the open source Parse Server is not at all the original code base of Parse.com. It started off as Parse developer advocate Fosco Marotto's effort to create an open source alternative at a Facebook hackathon. Creating parse-server wasn't an afterthought to the Parse shutdown, but something already in the works.
The problem with proprietary mBaaS solutions is the lock-in. The reason I will never trust anything production-level to run on Firebase is the same. I hate funny-looking database structures (Firebase uses a tree DB. Like, wtf? And then makes it realtime with websocket observers). Parse-Server going open source meant no lock-in. And since it was a ground-up effort, if you take a look at the Parse-Server codebase, you'll see it is a very, very thin wrapper over MongoDB, trying to make the API 100% compatible with the Parse client SDKs.
Side note: one major factor that made people love Parse was that it was hosted and managed. You didn't have to sysadmin or devops your way into hosting Parse. If you're looking for that, there's always back4app. It's not Facebook. But then Facebook stabbed Parse.com in the back too — so large company = longer support isn't really a thing. Google can turn off Firebase tomorrow too, if they please.
Welcome to an all new modular Parse: 1-script-backend instead of serverless
What Parse going open source has meant is that it has become highly modular. It is used by thousands of people for different use cases. So Parse has come up with an 'Adapter' scheme. There's a FileAdapter, a StorageAdapter, and so on. Which means (see the sketch after this list):
- You don't want FCM for push? Use OneSignal instead.
- Don't want to save files on disk? Use S3 or GCS instead.
- And the BIGGEST OF ALL — don't want to use MongoDB? Use PostgreSQL instead.
- You can get an entire list of modules here.
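To get a feel for how the adapter scheme surfaces in configuration, here is a rough sketch. The S3 adapter package name, keys and bucket below are placeholders I am assuming for illustration; check the adapter modules list for the exact packages and options.

```js
const { ParseServer } = require('parse-server');
// Assumed package name for the S3 files adapter — verify against the modules list
const S3Adapter = require('parse-server-s3-adapter');

const api = new ParseServer({
  // postgres:// here picks the Postgres storage adapter, mongodb:// would pick the Mongo one
  databaseURI: 'postgres://localhost:5432/parse_pg',
  appId: 'myAppId',          // placeholder
  masterKey: 'myMasterKey',  // placeholder
  serverURL: 'http://localhost:1337/parse',

  // Swap the default file storage for S3 via the filesAdapter option
  filesAdapter: new S3Adapter('AWS_ACCESS_KEY', 'AWS_SECRET_KEY', 'my-bucket'),

  // Push could similarly be routed through a different adapter, e.g. a OneSignal push adapter:
  // push: { adapter: new OneSignalPushAdapter({ oneSignalAppId: '...', oneSignalApiKey: '...' }) },
});
```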
Parse, which in its original shape was probably never imaginable with anything but MongoDB, can now run on PostgreSQL. This is nothing short of a wonder. (Yes, there are drawbacks. When you add random keys into your objects, the underlying table does an ALTER TABLE, but hey, you have an ORM with live-query on PostgreSQL.)
Spinning it up yourself (optionally with Postgres, and with a GUI dashboard)
The best way to use Parse Server is to use it as middleware in your Express application. Here's an example Node.js script for that.
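A minimal sketch of such a script, assuming a parse-server version from around this time and placeholder appId, masterKey and database values:

```js
const express = require('express');
const { ParseServer } = require('parse-server');

const app = express();

// The ParseServer instance is just Express middleware
const api = new ParseServer({
  databaseURI: 'mongodb://localhost:27017/dev',   // placeholder MongoDB connection string
  appId: 'myAppId',                               // placeholder
  masterKey: 'myMasterKey',                       // placeholder — keep this secret
  serverURL: 'http://localhost:1337/parse',
});

// Mount the Parse API at /parse
app.use('/parse', api);

// Reading the port from the environment keeps it Heroku-friendly
const port = process.env.PORT || 1337;
app.listen(port, () => console.log('parse-server running on port ' + port));
```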
And voila, that gives you a Parse Server running on port 1337. This script is also ready to run on Heroku.
A few things about the script —
- We do not have a web UI to manage it yet. We need to add parse-dashboard for that.
- It uses MongoDB, evidently. We want to use PostgreSQL there.
Let us see how it looks after changing that.
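A sketch of the changed setup — two ParseServer instances (one on MongoDB, one on PostgreSQL) plus parse-dashboard mounted on the same Express app. All app ids, keys and connection strings are placeholders:

```js
const express = require('express');
const { ParseServer } = require('parse-server');
const ParseDashboard = require('parse-dashboard');

const app = express();

// Backed by MongoDB — the mongodb:// scheme selects the MongoStorageAdapter
const mongoApi = new ParseServer({
  databaseURI: 'mongodb://localhost:27017/parse_mongo',
  appId: 'mongoApp',
  masterKey: 'mongoMasterKey',
  serverURL: 'http://localhost:1337/parse-mongo',
});

// Backed by PostgreSQL — the postgres:// scheme selects the PostgresStorageAdapter
const pgApi = new ParseServer({
  databaseURI: 'postgres://localhost:5432/parse_pg',
  appId: 'pgApp',
  masterKey: 'pgMasterKey',
  serverURL: 'http://localhost:1337/parse-pg',
});

// One dashboard showing both apps
const dashboard = new ParseDashboard({
  apps: [
    { appName: 'Mongo App', appId: 'mongoApp', masterKey: 'mongoMasterKey', serverURL: 'http://localhost:1337/parse-mongo' },
    { appName: 'Postgres App', appId: 'pgApp', masterKey: 'pgMasterKey', serverURL: 'http://localhost:1337/parse-pg' },
  ],
}, { allowInsecureHTTP: true }); // only for local testing

app.use('/parse-mongo', mongoApi);
app.use('/parse-pg', pgApi);
app.use('/dashboard', dashboard);

app.listen(process.env.PORT || 1337);
```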
So here we create two Parse Servers, one using MongoDB and another using Postgres. How is that done? Simple: if the databaseURI starts with mongodb://, parse-server will automatically use the MongoStorageAdapter; if it starts with postgres://, the PostgresStorageAdapter. This facility is already available in the latest version of parse-server.
(Screenshot: our dashboard showing both the Postgres and MongoDB apps running.)
Here is a proper repository, also hosted on Heroku, for you to try out.
How about some benchmarks?
I did a very rudimentary benchmark: inserted 1000 rows, and did 1000 where-queries, both using the REST API of Parse — once on the MongoDB instance and once on the PostgreSQL one.
These were done on a 2015 MacBook Pro 12,1 — i5, 8GB — with WebStorm and Firefox open in parallel, so these aren't exactly pristine and pure benchmarks.
==== mongodb benchmark create ====
real 0m13.509s
user 0m3.549s
sys 0m2.964s
= = = = = = = = = = = = = = = =
==== mongodb benchmark get(where)====
real 0m12.839s
user 0m3.439s
sys 0m2.939s
= = = = = = = = = = = = = = = =
==== postgres benchmark create ====
real 0m14.736s
user 0m3.638s
sys 0m3.153s
= = = = = = = = = = = = = = = =
==== postgres benchmark get(where)====
real 0m13.986s
user 0m3.470s
sys 0m2.982s
= = = = = = = = = = = = = = = =
This is the script used to benchmark — https://github.com/championswimmer/parse-heroku-postgres/blob/master/benchmark/curlbench.sh
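For a feel of what the script does, here is a rough JavaScript equivalent of one "create" call and one "where" query against the REST API. The app id, port and mount path are the placeholders used earlier; the linked curl script is the one actually used, and it loops these calls 1000 times.

```js
const http = require('http');

// Minimal helper to hit the Parse REST API mounted at /parse
function parseRequest(method, path, body, callback) {
  const req = http.request({
    host: 'localhost',
    port: 1337,
    path: '/parse' + path,
    method: method,
    headers: {
      'X-Parse-Application-Id': 'myAppId', // placeholder
      'Content-Type': 'application/json',
    },
  }, (res) => {
    let data = '';
    res.on('data', (chunk) => { data += chunk; });
    res.on('end', () => callback(JSON.parse(data)));
  });
  if (body) req.write(JSON.stringify(body));
  req.end();
}

// "create" step: POST one object into the GameScore class
parseRequest('POST', '/classes/GameScore', { score: 42, playerName: 'p42' }, console.log);

// "get(where)" step: fetch objects whose score equals 42
const where = encodeURIComponent(JSON.stringify({ score: 42 }));
parseRequest('GET', '/classes/GameScore?where=' + where, null, console.log);
```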
As we can see, from the perspective of speed, Postgres doesn't outperform MongoDB, for the simple reason that the JOINs are done in memory here. And no indexing is done automatically for you — adding indexes yourself can improve performance too (significantly, if I may say so).
But I'd go with Postgres over MongoDB for some cases even with Parse, because of two reasons:
- http://www.mongodb-is-web-scale.com/
- If my schema is tight and highly related (think transaction records), and does not have loose objects with unspecified keys. Because then I can manually index the respective columns and profit.
Keep in mind that if root-level keys are added to a model, the corresponding table goes under ALTER TABLE surgery. Changes at internal (nested) levels simply use JSONB — Postgres supporting JSON can be easily exploited here.
```js
let GameScore = Parse.Object.extend("GameScore"); // Table create
let gameScore = new GameScore();                  // Row create
gameScore.set("score", 1337);                     // ALTER TABLE ADD COLUMN score
gameScore.set("playerName", "Sean Plott");        // ALTER TABLE
let records = {wins: 10, losses: 5, draws: 4};
gameScore.set("records", records);                // ALTER TABLE with JSONB column
gameScore.set("records.wins", 12);                // **NO ALTER TABLE HERE**
```
Parse and the legacy: It still makes sense to use it. More than ever.
Every developer worth their salt would have reservations about using backend-less or serverless solutions. I do too. I have spent the greater part of the last 6 years learning a wide variety of languages and platforms, and in 9/10 cases, if I have to put a project into production, the only backend I would trust is one that I (or my team) have made ourselves.
That said, I believe, tools like Parse have a place of their own.
- It is a blessing for whipping up quick frontend demos (at hackathons or pitches).
- You often need to spin up a simple website, for example a basic CRM-like setup so that you can create custom forms (to collect data about leads at your front desk, or reviews of your product, etc.). You don't want to get into scaffolding boilerplate SQL and model classes and CRUD APIs.
- You may want to get started with a frontend or an app that requires some of the sugar Parse throws on top of the DB, like live queries, push notifications and automatic file handling with AWS S3 or Google Cloud Storage. You can start with Parse, and since it is open source, you can later get into the nuts and bolts and swap out parts with your hand-made solution to improve performance if you need to.
Caution and knowledge are a person’s greatest assets
Don’t blindly jump onto the Parse gravy train. There are caveats.
- The philosophy of data storage in Parse is very document-store oriented (given their heavy investment into MongoDB). This is not a silver bullet. There are many cases where SQL+RDBMS can outperform document stores by orders of magnitude. Even when you use Postgres, Parse still has to stay close to its document store API. Hence the performance seen above.
- The strength, as well as the biggest threat, that Parse possesses is its way-too-easy-to-use client libraries. I mean, just look at the API. It's so nicely done. http://docs.parseplatform.org/android/guide/#queries
http://docs.parseplatform.org/ios/guide/#queries
But these crutches might make you averse to using REST APIs the old-school way. You will get locked into this particular way of manipulating data, plain old CRUD will feel difficult, and you won't be able to easily move out of this ecosystem later if you want to.
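To illustrate the kind of sugar we are talking about, here is roughly what a query looks like through the JS SDK (the app id, server URL and the GameScore class are the placeholders used earlier):

```js
const Parse = require('parse/node');

Parse.initialize('myAppId');                     // placeholder app id
Parse.serverURL = 'http://localhost:1337/parse'; // placeholder server URL

const GameScore = Parse.Object.extend('GameScore');
const query = new Parse.Query(GameScore);
query.greaterThan('score', 1000);                // "WHERE score > 1000", no SQL in sight

query.find().then((results) => {
  results.forEach((gs) => console.log(gs.get('playerName')));
});
```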
If there is an easy way of doing things, it will be done. The popularity of JavaScript in even electronics and desktop apps is proof of that (read: Atwood's Law). The fact that almost 25% of all the world's websites are WordPress is also proof of the same. If there's a way to get a full-fledged backend for your apps and webapps by writing a single NodeJS script, I am betting a lot of people are going to do that.