- Install the appropriate Docker application for your operating system.
- If you’re running on Linux, install Docker Compose.
Make sure that Docker is allotted at least 4GB of memory. In Docker Desktop, you configure resource usage on the Advanced tab in Preferences (macOS) or Settings (Windows).
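If you want to verify the engine's allocation from the command line, one option is the `docker info` template output; this is a minimal sketch, and on Docker Desktop the value reflects the VM's memory rather than the host's:

```sh
# Print the memory available to the Docker engine, in bytes
docker info --format '{{.MemTotal}}'
```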
- Clone this repository to your local machine.
- Create the `.env` file and enter strong password values for both the `ELASTIC_PASSWORD` and `KIBANA_PASSWORD` variables. You must use the `ELASTIC_PASSWORD` value for further interactions with the cluster. The `KIBANA_PASSWORD` value is only used internally when configuring Kibana. A sample `.env` is sketched below.
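As a rough sketch, a minimal `.env` might look like the following; the values are placeholders to replace with your own strong passwords, and the repository may expect additional variables (for example, version or port settings) not shown here:

```sh
# Used for all further interactions with the cluster
ELASTIC_PASSWORD=changeme-strong-elastic-password

# Used internally when configuring Kibana
KIBANA_PASSWORD=changeme-strong-kibana-password
```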
- Create and start the three-node Elasticsearch cluster and Kibana instance:

  ```sh
  docker-compose up -d
  ```
- When the deployment has started, open a browser and navigate to http://localhost:5601 to access Kibana, where you can load sample data and interact with your cluster.
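If Kibana isn't reachable yet, it can help to confirm that all containers are up. `docker-compose ps` lists each service and its state, and `docker-compose logs kibana` shows Kibana's startup progress (the service name depends on the repository's compose file):

```sh
docker-compose ps
```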
To stop the cluster, run `docker-compose down`. The data in the Docker volumes is preserved and loaded when you restart the cluster with `docker-compose up`.

```sh
docker-compose down
```

To delete the network, containers, and volumes when you stop the cluster, specify the `-v` option:

```sh
docker-compose down -v
```
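The preserved data lives in named Docker volumes. To see which volumes the stack created (names are typically prefixed with the Compose project name), list them with:

```sh
docker volume ls
```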
- Open the terminal on your computer. Run the following command to send a test log event to the stack (a health check for the cluster itself is sketched after this step):

  ```sh
  curl --location 'http://localhost' \
    --header 'Content-Type: application/json' \
    --data '{
      "message": "Hello World!",
      "timestamp": "2023-06-11T22:32:03.762Z"
    }'
  ```
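To check the health of the cluster itself, you can query the Elasticsearch cluster health API. This sketch assumes Elasticsearch is exposed on localhost:9200 over plain HTTP and authenticates as the `elastic` user with the `ELASTIC_PASSWORD` from your `.env`; if your setup enables TLS, switch to `https://` and pass the CA certificate (or `-k` for testing only):

```sh
curl -u "elastic:$ELASTIC_PASSWORD" "http://localhost:9200/_cluster/health?pretty"
```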
- Go to Kibana's menu. Click on Management and then Stack Management.
- Click on Kibana and then Data Views. Click on Create data view.
- Type `logstash-*` in the Index pattern field and click on Save data view to Kibana.
- Go back to Kibana's menu. Navigate to Analytics and then Discover.
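Before creating the data view, you can confirm that the test event was actually indexed by searching the matching indices directly; this sketch assumes the pipeline writes to indices matching `logstash-*` on localhost:9200:

```sh
curl -u "elastic:$ELASTIC_PASSWORD" "http://localhost:9200/logstash-*/_search?size=1&pretty"
```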