Over the years I have tried to improve this answer as much as possible.
In this thread with Memes, I will do my best.
When an application is developed, we have:
- a development environment
- a production environment.
The application is created, improved, new features are added, bugs are corrected.
All of this happens in the development environment.
No memes? wait for it...
We have the usual problems:
- install dependencies
- understand what the problem is
- add a new shiny feature
And we also write a lot of code...
Once the development is done, developers are happy!
Why? because “It works on my machine”.
What's the problem?
When we go into production, we would like to recreate an identical environment.
What we can do is write a procedure to recreate the exact same environment, but we could run into issues: for example, production might be on a different operating system!
We would have to reinstall the same dependencies and re-create the same configuration on the production machine, and many things could go wrong.
Now, to be completely honest, this problem has already been solved in the past: with virtual machines.
We create a virtual machine, which contains an operating system and all the necessary dependencies and configurations.
Then we run the virtual machine on top of the physical machine's operating system through a hypervisor (a tool to manage virtual machines).
Problem solved, right?
Well, yes and no: virtual machines have their own operating system too, and they must be configured correctly to work, so we are (almost) back at the starting point.
To create virtual machines we need a very long configuration written down somewhere, and if something changes it must be carefully noted.
Furthermore, it is quite evident that we are wasting resources: we are recreating an entire system on top of a system.
All we really want is to ship our (maybe small) application with its dependencies!
So why am I here saying that containers are cool and that Docker is awesome?
Because containers solve the same problem, the shipping of an application, in a much simpler and smarter way.
Containerization technologies (Docker is just one of them) use a standard to isolate the application together with all its:
- dependencies
- configuration
- code
in something called an image.
Based on these images, containers can be created and started.
To make an analogy with the world of programming, an image is a class, and a container is an instance of that class.
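The class/instance analogy can itself be written down in Python. This is purely illustrative (the class name and fields are made up, and this is not the Docker API): the "image" is the fixed blueprint, and each "container" is an independent instance with its own runtime configuration.

```python
# Hypothetical illustration of the analogy: an image is like a class,
# and a container is like an instance of that class.
class PostgresImage:
    """Plays the role of a Docker image: a fixed, shared blueprint."""
    version = "10"  # baked into the "image", shared by all instances

    def __init__(self, port: int):
        # Each "container" gets its own runtime configuration.
        self.port = port

# Two "containers" created from the same "image", with different settings.
container_a = PostgresImage(port=5432)
container_b = PostgresImage(port=5433)

print(container_a.version, container_a.port)  # same blueprint...
print(container_b.version, container_b.port)  # ...independent instances
```

Just as with objects, starting a second container doesn't copy or modify the image: both run from the same blueprint side by side.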
The good news is that a ton of images of the most famous technologies already exist!
Another advantage of using containerization from development onwards is the possibility of using different versions for different projects at the same time (without even installing them).
For things like Node.js there are solutions, like NVM, to switch between versions, but for others it's not that easy.
Let's say we want to use Postgres version 10 for one project but we need to use Postgres 9 for another.
And we also have Postgres 12 installed.
Now, using all 3 of them and changing ports every time depending on which project you are working on is a real pain and can lead to headaches.
Instead, we would like separate dev environments for the 2 projects, without configuring ad-hoc machines.
We could solve this using VMs, but then we would have to manage them, and they are very resource-consuming.
With Docker, this can be avoided simply by using a version of Postgres from Docker Hub (a Docker registry that contains a lot of images).
Also, in a company, a single machine can be configured so that different developers can work on different projects, using different versions, at the same time.
They can work on different projects without colliding.
They can work on different versions of the same project!
Last thing...
Docker can't solve all your problems!
But it can make your life easier as a developer, which is already a lot!
If you like this style of teaching, please follow @FrancescoCiull4 and RT this to spread the word. Thank you
It was 2015. I was curious and started researching. I didn’t have any online presence at the time so I was just studying on my own trying to figure out how it worked.
Now I know many Docker Captains, like @BretFisher, @mikesir87, and @GianArb!
What is your favorite Docker command?
This is a nice question! I think I will go with "docker compose up --build"; it is exactly what you need to rebuild and test in your development environment.
Resources that link Blockchain to Docker🐳
· Create Ethereum Dapp with React + Docker
· Deploying Blockchain Applications with Docker
· Docker usage in Blockchain
· Docker in Blockchain Projects
· Go Ethereum (Go implementation of Ethereum protocol)
Blockchain blocks hold batches of valid transactions, hashed and encoded into a Merkle tree.
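As a rough sketch of how a batch of transactions is hashed into a Merkle tree, here is a minimal Python example using the standard hashlib module. This is illustrative only: real implementations (e.g. Bitcoin's) differ in details such as double SHA-256 and how odd leaf counts are handled.

```python
import hashlib

def sha256(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(transactions: list[bytes]) -> bytes:
    """Hash each transaction, then pair up hashes level by level
    until a single root hash remains."""
    level = [sha256(tx) for tx in transactions]
    while len(level) > 1:
        if len(level) % 2 == 1:      # duplicate the last hash if the count is odd
            level.append(level[-1])
        level = [sha256(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]

txs = [b"alice->bob:5", b"bob->carol:2", b"carol->dan:1"]
root = merkle_root(txs)
print(root.hex())  # changing any single transaction changes the root
```

Because the root depends on every transaction, a block only needs to store this one hash to commit to the whole batch.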
Basic concepts to study how they work:
· Hash
· Sign
· Genesis Block
· Fork
· Consensus Algorithm
· Peers and the database
· History
· New entries
· Redundant Computation
Hash
A hash is a math function that converts an input of arbitrary length into a fixed-length output. (Note: hashing is not encryption; it is a one-way function that cannot be reversed.)
· Each block includes the cryptographic hash of the previous block, linking the two.
· Blocks are hashed and encoded.
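A minimal sketch in Python, using the standard hashlib module, of how each block can embed the hash of the previous one:

```python
import hashlib

def block_hash(prev_hash: str, data: str) -> str:
    """SHA-256 always produces a fixed-length digest (64 hex chars),
    no matter how long the input is."""
    return hashlib.sha256((prev_hash + data).encode()).hexdigest()

genesis = block_hash("0" * 64, "genesis block")
block_1 = block_hash(genesis, "some transactions")
block_2 = block_hash(block_1, "more transactions")

# Each hash depends on the previous one: changing any earlier block
# changes every hash that comes after it.
print(len(block_2))  # 64, regardless of input length
```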
Digital signature
A digital signature is an authentication mechanism.
It enables the creator of a message to attach a code that acts as a signature.
Here is a thread that summarizes what I have studied so far.
Summary:
· What is a Blockchain
· Blocks
· Resistance to modification
· Secure by design
· Structure
· Verification
· Robust workflow
· Value Exchange protocol
· Layers
· What is a Blockchain
It's a growing list of records (blocks).
The blocks are linked together using cryptography.
It's often described as immutable data storage:
- trustless
- fully decentralized
- peer-to-peer
- immutable
It's spread over a network of participants (nodes).
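The "resistance to modification" above can be sketched in a few lines of Python: each record stores the hash of everything before it, so if any record is altered, the stored links no longer match. This is a toy model to show the idea, not a real blockchain node.

```python
import hashlib

def h(s: str) -> str:
    return hashlib.sha256(s.encode()).hexdigest()

# Each block stores its data plus the hash of the chain so far.
chain = []
prev = "0" * 64
for data in ["genesis", "alice->bob:5", "bob->carol:2"]:
    chain.append({"data": data, "prev": prev})
    prev = h(prev + data)

def is_valid(chain) -> bool:
    """Recompute the hash links and compare with the stored ones."""
    prev = "0" * 64
    for block in chain:
        if block["prev"] != prev:
            return False
        prev = h(prev + block["data"])
    return True

print(is_valid(chain))               # True: links match
chain[1]["data"] = "alice->bob:500"  # tamper with one record
print(is_valid(chain))               # False: the links no longer match
```

In a real network this check is performed independently by every node, which is why changing history would require convincing the whole network at once.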