Why ProtoBuf? A hands-on lab to motivate you to try it out.


"You eat for yourself but dress to the occasion." goes the wisdom from one generation to the other. A similar wisdom is passed on from the experienced programmers to the enthusiasts, that goes like, "You code for others (humans) in your team"

Martin Fowler shows his witty side by saying the same thing: "Any fool can write code that a computer can understand. Good programmers write code that humans can understand."

Unfortunately, this advice is often neglected, resulting in fragile code-bases. I'd like to put this in the context of API contracts and data interchange/serialization formats for communication between distributed systems. SOAP/XML-based protocols were once the favorites for interoperability; then came REST/JSON, beating them by reducing verbosity and leveraging the HTTP infrastructure in their favor. Google later introduced ProtoBuf as an alternative, offering strong type-safety, popularized via gRPC.

REST/JSON, in my opinion, should be the first choice for inter-system communication over the network, especially when you want to leverage HTTP infrastructure. But what about systems handling high traffic with extreme demands for low latency? That is when I would opt for gRPC/ProtoBuf. What about web-socket communication for high-volume, low-latency messaging? You can, and should, leverage the ProtoBuf data-serialization format there too. Yet most teams overlook ProtoBuf, mostly out of unfamiliarity. And then there are teams that consciously opt for packing domain data as raw byte arrays to squeeze out the best system performance, but at the cost of brittle code and an obscure API contract. While modern tools like Swagger/OpenAPI are used to communicate such contracts, they can still lead to miscommunication in spite of the best efforts.
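To see why hand-packed byte arrays make for brittle code, here is a minimal sketch of the approach, with a hypothetical "stock quote" message and an invented wire layout. Notice that nothing in the bytes themselves documents the layout; every consumer must re-implement it exactly, and any change to the format silently breaks them.

```python
import struct

# Hypothetical wire layout (not from any real system):
# 4-byte ASCII symbol, 8-byte double price, 4-byte unsigned int volume,
# all big-endian. This knowledge lives only in this constant and in the
# heads of the developers -- the payload itself carries no schema.
QUOTE_FORMAT = ">4sdI"

def pack_quote(symbol: str, price: float, volume: int) -> bytes:
    """Pack a quote into a raw binary packet."""
    return struct.pack(QUOTE_FORMAT, symbol.encode("ascii"), price, volume)

def unpack_quote(payload: bytes) -> tuple[str, float, int]:
    """Unpack a raw binary packet; must mirror pack_quote exactly."""
    symbol, price, volume = struct.unpack(QUOTE_FORMAT, payload)
    return symbol.decode("ascii"), price, volume

packet = pack_quote("ACME", 101.5, 2000)
print(unpack_quote(packet))  # ('ACME', 101.5, 2000)
```

If the producer later widens `volume` to 8 bytes, every consumer decodes garbage until its copy of the format string is updated by hand; this is exactly the kind of implicit contract that a ProtoBuf schema makes explicit.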

In this post, I intend to walk you through an exercise that helps you understand and appreciate how ProtoBuf, as a tool, multiplies the productivity of teams that build and consume APIs, particularly APIs that pack data as bytes and send it as binary packets to meet demanding scalability and latency needs.

If you are game, get a stopwatch and a buddy, and start working on the exercise below, each attempting to solve it on your own. You are expected to draft the schema of the response, along with a couple of examples showing what a response adhering to that schema would look like.

--

--

If you are done, compare your work with your buddy's and answer the questions below:

  1. How long did each of you take to finish?
  2. Did both of you get it right?

It is very likely that both of you went back and forth, reading and re-reading for a good while, only to come up with solutions that differ from each other, with neither being quite right. This is what happens in the course of consuming various APIs, and no wonder API integrations are known to be painful.

ProtoBuf bridges this gap with a human-readable contract, plus the bonus of code generation of DTOs (Data Transfer Objects) in the language of your choice (ProtoBuf supports most popular languages), thus reducing a lot of friction in API consumption and boosting your productivity many-fold. It is also a way to reduce friction between the producer and the consumers of an API.
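As a sketch of what such a contract looks like, here is a hypothetical `.proto` file for a stock-quote message (the package, message, and field names are illustrative, not from any real API):

```proto
// quote.proto -- a hypothetical contract; names and types are illustrative.
syntax = "proto3";

package quotes;

// A single stock quote. The schema itself documents the field names,
// types, and tag numbers -- no prose spec or guesswork needed.
message Quote {
  string symbol = 1;
  double price  = 2;
  uint32 volume = 3;
}
```

Running `protoc` with the code-generation flag for your language (for example, `protoc --java_out=. quote.proto`) produces the DTO classes, so producer and consumers all derive their code from one unambiguous source of truth.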

If you are now motivated enough to learn more about it, do read "All you need to know about ProtoBuf" as your next step.