To set up a minimal environment with Elasticsearch and Docker:
- install Docker for your OS
- install Docker Compose (optional, only needed for the second setup below)
To run Elasticsearch with the official Docker image, open your terminal and run:

docker run -p 9200:9200 -p 9300:9300 --name elasticsearch \
  -e "discovery.type=single-node" docker.elastic.co/elasticsearch/elasticsearch:7.8.1

This starts Elasticsearch in single-node mode. Open another terminal tab and check that it is ready at localhost:9200. It may take a few minutes until the node is available.
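Instead of refreshing localhost:9200 by hand, you can poll the endpoint from a small script until the node answers. This is a minimal sketch using only the Python standard library; it assumes the default host/port from the command above and reads the `version.number` field that Elasticsearch returns on `GET /`.

```python
import json
import time
import urllib.request
from urllib.error import URLError


def parse_version(body: str) -> str:
    """Extract version.number from the JSON that GET / returns."""
    return json.loads(body)["version"]["number"]


def wait_for_es(url: str = "http://localhost:9200", timeout: int = 300) -> str:
    """Poll until Elasticsearch answers, then return its version number."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with urllib.request.urlopen(url, timeout=5) as resp:
                return parse_version(resp.read().decode("utf-8"))
        except URLError:
            time.sleep(5)  # node not up yet, retry
    raise TimeoutError(f"Elasticsearch did not come up at {url}")


if __name__ == "__main__":
    print("Elasticsearch is ready, version", wait_for_es())
```

Run it in a second terminal while the container starts; it exits as soon as the node responds.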
If you prefer to organize Docker containers via Docker Compose, here is a short setup that runs an Elasticsearch instance plus Cerebro, a UI tool to inspect and manage an Elasticsearch cluster.
Copy the following content into a new file named docker-compose.yml and save it.
# docker-compose.yml
version: "3.6"
services:
  elasticsearch01:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.8.1
    volumes:
      - esdata1:/usr/share/elasticsearch/data
    ports:
      - 9200:9200
      - 9300:9300
    environment:
      - cluster.name=docker-cluster
      - cluster.routing.allocation.disk.threshold_enabled=false
      - discovery.type=single-node
      - bootstrap.memory_lock=true
      - http.cors.enabled=true
      - http.cors.allow-origin=*
      - "ES_JAVA_OPTS=-Xms1024m -Xmx1024m"
    ulimits:
      memlock:
        soft: -1
        hard: -1
    networks:
      - esnet
  cerebro:
    image: lmenezes/cerebro:0.9.2
    ports:
      - 9000:9000
    networks:
      - esnet
volumes:
  esdata1:
    driver: local
networks:
  esnet:

To start both containers, run Docker Compose with:
docker-compose up --build -d

You can also omit the -d flag (detach: run in the background) to see the logs of both containers. It may take a few minutes; once everything has started, the following ports are available:
- localhost:9200 (Elasticsearch)
- localhost:9000 (Cerebro)
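Once both ports respond, you can smoke-test the cluster by indexing a document and fetching it back over the REST API. The sketch below uses only the Python standard library; the index name `articles` and the sample document are made up for illustration, and it assumes Elasticsearch is reachable on the port published above.

```python
import json
import urllib.request

ES = "http://localhost:9200"  # port published by the compose file


def doc_url(index: str, doc_id: str, base: str = ES) -> str:
    """Build the document endpoint URL for an index/id pair."""
    return f"{base}/{index}/_doc/{doc_id}"


def put_json(url: str, payload: dict) -> dict:
    """Send a JSON document via HTTP PUT and return the decoded response."""
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="PUT",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))


if __name__ == "__main__":
    # Index one document, then fetch it back.
    print(put_json(doc_url("articles", "1"), {"title": "Hello Elasticsearch"}))
    with urllib.request.urlopen(doc_url("articles", "1")) as resp:
        print(json.loads(resp.read().decode("utf-8"))["_source"])
```

The same index should then show up in Cerebro at localhost:9000 when you point it at http://elasticsearch01:9200 (the service name inside the Compose network).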