Big bang deployment is the most straightforward strategy: we roll out the new version in one go and accept service downtime during the cutover. If the deployment fails, we roll back to the previous version.
In blue-green deployment, two identical production environments run simultaneously: blue serves live traffic on the current version while green runs the new version. Once the green environment passes the tests, the load balancer switches user traffic over to it.
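As a rough illustration, a toy router could model the blue-to-green cutover as an atomic swap of the active backend pool. A minimal sketch, assuming made-up environment addresses and not any specific load balancer's API:

```rust
use std::sync::atomic::{AtomicUsize, Ordering};

// 0 = blue (current version), 1 = green (new version)
static ACTIVE_ENV: AtomicUsize = AtomicUsize::new(0);

// Hypothetical backend addresses for the two environments.
const BACKENDS: [&str; 2] = ["blue.internal:8080", "green.internal:8080"];

// Every incoming request is forwarded to whichever environment is active.
fn pick_backend() -> &'static str {
    BACKENDS[ACTIVE_ENV.load(Ordering::Acquire)]
}

// Once green passes its tests, flip all traffic over in one step.
fn switch_to_green() {
    ACTIVE_ENV.store(1, Ordering::Release);
}

fn main() {
    println!("before switch: {}", pick_backend());
    switch_to_green();
    println!("after switch:  {}", pick_backend());
}
```

Because the switch is a single pointer flip, rolling back is just flipping it back to blue.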
With canary deployment, only a small portion of instances is upgraded to the new version. Once all the tests pass, a portion of users is routed to the canary instances.
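A minimal sketch of how a router might send a fixed share of users to the canary fleet; the 5% split and the hash-on-user-ID rule are illustrative assumptions, not a prescribed scheme:

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

// Send roughly 5% of users to the canary instances, the rest to the stable fleet.
const CANARY_PERCENT: u64 = 5;

fn route(user_id: u64) -> &'static str {
    // Hash the user ID so each user is consistently routed to the same pool.
    let mut hasher = DefaultHasher::new();
    user_id.hash(&mut hasher);
    if hasher.finish() % 100 < CANARY_PERCENT {
        "canary-pool"
    } else {
        "stable-pool"
    }
}

fn main() {
    for id in 0..10u64 {
        println!("user {id} -> {}", route(id));
    }
}
```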
With a feature toggle, a small portion of users with a specific flag enabled goes through the new feature's code path, while all other users go through the normal code path (a minimal sketch follows the list below).
💡 No downtime ✅
💡 Targeted users ✅
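A minimal sketch of the flag check described above; the checkout example, the flag rule, and the 5% rollout are hypothetical, and in practice the flag would come from a feature-flag or config service:

```rust
struct User {
    id: u64,
}

// Hypothetical toggle: enable the new flow for ~5% of users by ID.
fn new_checkout_enabled(user: &User) -> bool {
    user.id % 100 < 5
}

fn checkout(user: &User) {
    if new_checkout_enabled(user) {
        // New feature code path, behind the toggle.
        println!("user {} -> new checkout flow", user.id);
    } else {
        // Normal code path for everyone else.
        println!("user {} -> existing checkout flow", user.id);
    }
}

fn main() {
    checkout(&User { id: 3 });
    checkout(&User { id: 42 });
}
```

Because both code paths ship in the same deployment, turning the feature on or off needs no redeploy and no downtime.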
7/ Over to you: Which deployment strategies have you used?
1/ Is it possible to run C, C++, or Rust in a web browser?
2/ What is WebAssembly (WASM)? Why does it attract so much attention?
The diagram shows how we can run native C/C++/Rust code inside a web browser with WASM.
3/ Traditionally, we could only run JavaScript in the web browser, and its performance cannot match native code like C/C++ because JavaScript is interpreted.
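As an illustration, here is a minimal sketch of exposing a native Rust function to the browser with the wasm-bindgen crate and the wasm-pack build tool; the toolchain choice and the fibonacci function are assumptions for the example (C/C++ would typically go through Emscripten instead):

```rust
// lib.rs — built with: wasm-pack build --target web
use wasm_bindgen::prelude::*;

// Exported to JavaScript; the loop runs as WebAssembly inside the browser.
#[wasm_bindgen]
pub fn fibonacci(n: u32) -> u64 {
    let (mut a, mut b) = (0u64, 1u64);
    for _ in 0..n {
        let next = a + b;
        a = b;
        b = next;
    }
    a
}
```

On the JavaScript side, the generated glue module is imported and `fibonacci` is called like an ordinary function, while the computation itself executes as WASM.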
The diagram below shows why real-time gaming and low-latency trading applications should not use microservice architecture.
/2 These applications share some common characteristics that push them toward a monolithic architecture:
/3 🔹 These applications are very latency-sensitive. For real-time gaming, latency should be at the millisecond level; for low-latency trading, it should be at the microsecond level.
/2 First gen: the organic evolution. Uber's architecture in 2014 had two key services: dispatch and API. The dispatch service connects riders with drivers, while the API service stores long-term data about users and trips.
/3 Second gen: the all-encompassing gateway. Uber adopted a microservice architecture very early on. By 2019, Uber's products were powered by 2,200+ microservices as a result of this architectural decision.