
@gamerson
Created April 3, 2019 18:06
./gradlew startDxpCloudLocal log
~/my repos/liferaygreg(master ✔) ./gradlew startDxpCloudLocal
..:: Liferay Workspace for DXP Cloud generated from archetype 'wedeploy-dxpcloud-workspace:2.3.0' ::..
:wedeploy:clean
:wedeploy:buildDxpCloudServices
Collecting static DXP Cloud files from '/Users/greg/my repos/liferaygreg/wedeploy':
* file 'wedeploy/database/wedeploy.json'
* file 'wedeploy/liferay/license/common/license.xml'
* file 'wedeploy/liferay/config/uat/portal-env.properties'
* file 'wedeploy/liferay/config/local/portal-env.properties'
* file 'wedeploy/liferay/config/common/com.liferay.portal.search.elasticsearch6.configuration.ElasticsearchConfiguration.config'
* file 'wedeploy/liferay/config/common/portal-all.properties'
* file 'wedeploy/liferay/config/prd/portal-env.properties'
* file 'wedeploy/liferay/config/dev/portal-env.properties'
* file 'wedeploy/liferay/wedeploy.json'
* file 'wedeploy/ci/wedeploy.json'
* file 'wedeploy/search/wedeploy.json'
* file 'wedeploy/webserver/config/uat/liferay.conf'
* file 'wedeploy/webserver/config/common/temporary-upload.conf'
* file 'wedeploy/webserver/config/dev/liferay.conf'
* file 'wedeploy/webserver/deploy/uat/.htpasswd'
* file 'wedeploy/webserver/deploy/common/temporary-upload.html'
* file 'wedeploy/webserver/deploy/prd/.htpasswd'
* file 'wedeploy/webserver/deploy/dev/.htpasswd'
* file 'wedeploy/webserver/wedeploy.json'
* file 'wedeploy/docker-compose.yml'
* file 'wedeploy/backup/wedeploy.json'
:wedeploy:createMinimalBundleSrc
:wedeploy:zipMinimalBundleToBundlesCache
Minimal bundle created: /Users/greg/.liferay/bundles/wedeploy-bundle-minimal.zip
:downloadBundle SKIPPED
:distBundle
:wedeploy:copyBuiltWorkspaceModules NO-SOURCE
:wedeploy:distWeDeploy
Services for DXP Cloud were built and are ready to be deployed from 'build/wedeploy':
* database
* liferay
* ci
* search
* webserver
* backup
:wedeploy:startDxpCloudLocal
Starting Liferay stack for DXP Cloud locally, using command:
docker-compose --file /Users/greg/my repos/liferaygreg/build/wedeploy/docker-compose.yml --project-name dxpcloud_liferaygreg up --abort-on-container-exit --renew-anon-volumes --build --force-recreate
Recreating dxpcloud_liferaygreg_search_1 ... done
Recreating dxpcloud_liferaygreg_database_1 ... done
Recreating dxpcloud_liferaygreg_liferay_1 ... done
Recreating dxpcloud_liferaygreg_webserver_1 ... done
Attaching to dxpcloud_liferaygreg_database_1, dxpcloud_liferaygreg_search_1, dxpcloud_liferaygreg_liferay_1, dxpcloud_liferaygreg_webserver_1
search_1 | Starting Elasticsearch instance.
search_1 |
search_1 | BUILD_DATE:
search_1 | BUILD_VCS_REF:
search_1 | BUILD_VERSION:
search_1 | CLUSTER_NAME: liferay_cluster
search_1 | NODE_MASTER: true
search_1 | HOSTNAME: 9eaa428e25bb
search_1 | NODE_DATA: true
search_1 | NODE_INGEST: true
search_1 | PING_UNICAST_HOST: search
search_1 | MAX_LOCAL_STORAGE_NODES: 1
search_1 | NETWORK_HOST: _site_
search_1 | MEMORY_LOCK: false
search_1 | NUMBER_OF_MASTERS: 1
search_1 | ENABLE_XPACK_SECURITY: false
search_1 |
database_1 | 2019-04-03 18:04:11 140558057260992 [Note] mysqld (mysqld 10.2.20-MariaDB-1:10.2.20+maria~bionic) starting as process 1 ...
liferay_1 | Starting Liferay DXP instance.
liferay_1 |
liferay_1 | LIFERAY_HOME: /opt/liferay
liferay_1 | BUILD_DATE: 2019-01-31T14:15:21Z
liferay_1 | BUILD_VCS_REF: da4177db53fdf42191a3aeba935ed7bd477a15cb
liferay_1 | BUILD_VERSION: 7.1.0-ga1-fp6-2.0.4
liferay_1 |
database_1 | 2019-04-03 18:04:11 140558057260992 [Note] InnoDB: Mutexes and rw_locks use GCC atomic builtins
database_1 | 2019-04-03 18:04:11 140558057260992 [Note] InnoDB: Uses event mutexes
database_1 | 2019-04-03 18:04:11 140558057260992 [Note] InnoDB: Compressed tables use zlib 1.2.11
database_1 | 2019-04-03 18:04:11 140558057260992 [Note] InnoDB: Using Linux native AIO
database_1 | 2019-04-03 18:04:11 140558057260992 [Note] InnoDB: Number of pools: 1
database_1 | 2019-04-03 18:04:11 140558057260992 [Note] InnoDB: Using SSE2 crc32 instructions
search_1 | ##
search_1 | ## Config
search_1 | ##
liferay_1 | ##
liferay_1 | ## Config
liferay_1 | ##
search_1 | 'config' directory found. The following contents are going to be copied to /usr/share/elasticsearch/config :
search_1 |
database_1 | 2019-04-03 18:04:11 140558057260992 [Note] InnoDB: Initializing buffer pool, total size = 256M, instances = 1, chunk size = 128M
liferay_1 | 'config' directory found. The following contents are going to be copied to /opt/liferay and/or /opt/liferay/osgi/configs:
liferay_1 |
search_1 | /wedeploy-container/config
search_1 | `-- common
database_1 | 2019-04-03 18:04:11 140558057260992 [Note] InnoDB: Completed initialization of buffer pool
database_1 | 2019-04-03 18:04:11 140557316691712 [Note] InnoDB: If the mysqld execution user is authorized, page cleaner thread priority can be changed. See the man page of setpriority().
search_1 |
search_1 |
liferay_1 | /wedeploy-container/config
liferay_1 | ├── common
liferay_1 | │   ├── com.liferay.portal.search.elasticsearch6.configuration.ElasticsearchConfiguration.config
liferay_1 | │   └── portal-all.properties
liferay_1 | ├── dev
liferay_1 | │   └── portal-env.properties
liferay_1 | ├── local
liferay_1 | │   └── portal-env.properties
liferay_1 | ├── prd
liferay_1 | │   └── portal-env.properties
liferay_1 | └── uat
liferay_1 | └── portal-env.properties
search_1 | ##
search_1 | ## License
search_1 | ##
database_1 | 2019-04-03 18:04:11 140558057260992 [Note] InnoDB: Highest supported file format is Barracuda.
database_1 | 2019-04-03 18:04:11 140558057260992 [Note] InnoDB: 128 out of 128 rollback segments are active.
liferay_1 |
liferay_1 |
database_1 | 2019-04-03 18:04:11 140558057260992 [Note] InnoDB: Creating shared tablespace for temporary tables
database_1 | 2019-04-03 18:04:11 140558057260992 [Note] InnoDB: Setting file './ibtmp1' size to 12 MB. Physically writing the file full; Please wait ...
database_1 | 2019-04-03 18:04:11 140558057260992 [Note] InnoDB: File './ibtmp1' size is now 12 MB.
search_1 | 'license' directory found. The following licenses will be put into elasticsearch:
search_1 |
database_1 | 2019-04-03 18:04:11 140558057260992 [Note] InnoDB: 5.7.24 started; log sequence number 1620043
liferay_1 | ##
liferay_1 | ## Deploy
liferay_1 | ##
database_1 | 2019-04-03 18:04:11 140557082941184 [Note] InnoDB: Loading buffer pool(s) from /var/lib/mysql/ib_buffer_pool
database_1 | 2019-04-03 18:04:11 140557082941184 [Note] InnoDB: Buffer pool(s) load completed at 190403 18:04:11
database_1 | 2019-04-03 18:04:11 140558057260992 [Note] Plugin 'FEEDBACK' is disabled.
database_1 | 2019-04-03 18:04:11 140558057260992 [Note] Server socket created on IP: '::'.
database_1 | 2019-04-03 18:04:11 140558057260992 [Warning] 'proxies_priv' entry '@% root@5dc3450df807' ignored in --skip-name-resolve mode.
search_1 | /wedeploy-container/license
search_1 | `-- common
liferay_1 | 'deploy' directory found. The content below will be copied according to the following:
liferay_1 | .jar files will be copied to /opt/liferay/osgi/modules
liferay_1 | .lpkg files will be copied to /opt/liferay/osgi/marketplace
liferay_1 | .war files will be copied to /opt/liferay/osgi/war
liferay_1 |
database_1 | 2019-04-03 18:04:11 140558057260992 [Note] Reading of all Master_info entries succeded
database_1 | 2019-04-03 18:04:11 140558057260992 [Note] Added new Master_info '' to hash table
database_1 | 2019-04-03 18:04:11 140558057260992 [Note] mysqld: ready for connections.
database_1 | Version: '10.2.20-MariaDB-1:10.2.20+maria~bionic' socket: '/var/run/mysqld/mysqld.sock' port: 3306 mariadb.org binary distribution
search_1 | ls: cannot access /tmp/license/*.json: No such file or directory
liferay_1 | /wedeploy-container/deploy
liferay_1 | ├── common
liferay_1 | ├── dev
liferay_1 | ├── prd
liferay_1 | └── uat
liferay_1 |
liferay_1 |
liferay_1 | ##
liferay_1 | ## License
liferay_1 | ##
liferay_1 | 'license' directory found. The following contents are going to be copied to /opt/liferay/deploy and/or /opt/liferay/data:
liferay_1 |
liferay_1 | /wedeploy-container/license
liferay_1 | ├── common
liferay_1 | │   └── license.xml
liferay_1 | ├── dev
liferay_1 | ├── prd
liferay_1 | └── uat
liferay_1 |
liferay_1 |
liferay_1 | ##
liferay_1 | ## Hot fix
liferay_1 | ##
search_1 |
search_1 |
liferay_1 | No 'hotfix' directory found. If you wish to apply hotfixes to Liferay make sure
liferay_1 | to drop your hotfixes (zip) files in the 'hotfix' directory.
liferay_1 |
liferay_1 |
liferay_1 | ##
liferay_1 | ## Cluster
liferay_1 | ##
liferay_1 | Cluster not enabled. If you want to enable cluster please set the environment variable WEDEPLOY_PROJECT_LIFERAY_CLUSTER_ENABLED to true
liferay_1 |
liferay_1 |
search_1 | ##
search_1 | ## Script
search_1 | ##
liferay_1 | ##
liferay_1 | ## Monitor
liferay_1 | ##
liferay_1 | Monitor is not enabled. If you want to enable please set the environment variables
liferay_1 | WEDEPLOY_PROJECT_MONITOR_DYNATRACE_TENANT and WEDEPLOY_PROJECT_MONITOR_DYNATRACE_TOKEN.
liferay_1 |
liferay_1 |
search_1 | 'script' directory found. The following contents are going to be executed
search_1 |
liferay_1 | ##
liferay_1 | ## Script
search_1 | /wedeploy-container/script
search_1 | `-- common
search_1 | Ignoring /tmp/script/*
search_1 |
search_1 |
search_1 |
search_1 | ##
search_1 | ## deploy
liferay_1 | ##
liferay_1 | 'script' directory found. The following contents are going to be executed
liferay_1 |
search_1 | ##
liferay_1 | /wedeploy-container/script
liferay_1 | ├── common
liferay_1 | ├── dev
liferay_1 | ├── prd
liferay_1 | └── uat
liferay_1 | Ignoring /tmp/script/*
liferay_1 |
liferay_1 |
liferay_1 |
search_1 | No 'deploy' directory found. If you wish to install additional ElasticSearch
search_1 | plugins, make sure to drop your *.plugins files into 'deploy' directory.
search_1 |
search_1 |
webserver_1 | 2019-04-03 18:04:12,281 CRIT Supervisor running as root (no user in config file)
webserver_1 | 2019-04-03 18:04:12,294 CRIT Server 'unix_http_server' running without any HTTP authentication checking
webserver_1 | 2019-04-03 18:04:12,295 INFO supervisord started with pid 1
liferay_1 | 03-Apr-2019 18:04:12.816 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Server version: Apache Tomcat/9.0.6
liferay_1 | 03-Apr-2019 18:04:12.821 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Server built: Mar 5 2018 09:34:35 UTC
liferay_1 | 03-Apr-2019 18:04:12.822 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Server number: 9.0.6.0
liferay_1 | 03-Apr-2019 18:04:12.822 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log OS Name: Linux
liferay_1 | 03-Apr-2019 18:04:12.823 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log OS Version: 4.9.125-linuxkit
liferay_1 | 03-Apr-2019 18:04:12.823 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Architecture: amd64
liferay_1 | 03-Apr-2019 18:04:12.824 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Java Home: /usr/lib/jvm/java-8-openjdk-amd64/jre
liferay_1 | 03-Apr-2019 18:04:12.824 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log JVM Version: 1.8.0_171-8u171-b11-1~bpo8+1-b11
liferay_1 | 03-Apr-2019 18:04:12.824 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log JVM Vendor: Oracle Corporation
liferay_1 | 03-Apr-2019 18:04:12.825 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log CATALINA_BASE: /opt/liferay/tomcat
liferay_1 | 03-Apr-2019 18:04:12.825 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log CATALINA_HOME: /opt/liferay/tomcat
liferay_1 | 03-Apr-2019 18:04:12.826 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Djava.util.logging.config.file=/opt/liferay/tomcat/conf/logging.properties
liferay_1 | 03-Apr-2019 18:04:12.826 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Djava.util.logging.manager=org.apache.juli.ClassLoaderLogManager
liferay_1 | 03-Apr-2019 18:04:12.827 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Djdk.tls.ephemeralDHKeySize=2048
liferay_1 | 03-Apr-2019 18:04:12.827 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Djava.protocol.handler.pkgs=org.apache.catalina.webresources
liferay_1 | 03-Apr-2019 18:04:12.828 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Dfile.encoding=UTF8
liferay_1 | 03-Apr-2019 18:04:12.828 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Djava.net.preferIPv4Stack=true
liferay_1 | 03-Apr-2019 18:04:12.829 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Dorg.apache.catalina.loader.WebappClassLoader.ENABLE_CLEAR_REFERENCES=false
liferay_1 | 03-Apr-2019 18:04:12.829 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Duser.timezone=GMT
liferay_1 | 03-Apr-2019 18:04:12.829 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Xms4G
liferay_1 | 03-Apr-2019 18:04:12.830 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Xmx4G
liferay_1 | 03-Apr-2019 18:04:12.830 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Dignore.endorsed.dirs=
liferay_1 | 03-Apr-2019 18:04:12.831 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Dcatalina.base=/opt/liferay/tomcat
liferay_1 | 03-Apr-2019 18:04:12.831 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Dcatalina.home=/opt/liferay/tomcat
liferay_1 | 03-Apr-2019 18:04:12.832 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Djava.io.tmpdir=/opt/liferay/tomcat/temp
liferay_1 | 03-Apr-2019 18:04:12.832 INFO [main] org.apache.catalina.core.AprLifecycleListener.lifecycleEvent The APR based Apache Tomcat Native library which allows optimal performance in production environments was not found on the java.library.path: [/usr/java/packages/lib/amd64:/usr/lib/x86_64-linux-gnu/jni:/lib/x86_64-linux-gnu:/usr/lib/x86_64-linux-gnu:/usr/lib/jni:/lib:/usr/lib]
liferay_1 | 03-Apr-2019 18:04:13.029 INFO [main] org.apache.coyote.AbstractProtocol.init Initializing ProtocolHandler ["http-nio2-8080"]
liferay_1 | 03-Apr-2019 18:04:13.043 INFO [main] org.apache.catalina.startup.Catalina.load Initialization processed in 1012 ms
liferay_1 | 03-Apr-2019 18:04:13.095 INFO [main] org.apache.catalina.core.StandardService.startInternal Starting service [Catalina]
liferay_1 | 03-Apr-2019 18:04:13.096 INFO [main] org.apache.catalina.core.StandardEngine.startInternal Starting Servlet Engine: Apache Tomcat/9.0.6
liferay_1 | 03-Apr-2019 18:04:13.111 INFO [main] org.apache.catalina.startup.HostConfig.deployDescriptor Deploying deployment descriptor [/opt/liferay/tomcat/conf/Catalina/localhost/ROOT.xml]
liferay_1 | 03-Apr-2019 18:04:13.144 WARNING [main] org.apache.catalina.startup.HostConfig.deployDescriptor The path attribute with value [] in deployment descriptor [/opt/liferay/tomcat/conf/Catalina/localhost/ROOT.xml] has been ignored
webserver_1 | 2019-04-03 18:04:13,298 INFO spawned: 'haproxy' with pid 9
webserver_1 | 2019-04-03 18:04:13,300 INFO spawned: 'nginx' with pid 10
webserver_1 | 2019-04-03 18:04:13,302 INFO spawned: 'tusd' with pid 11
webserver_1 | Starting Liferay's NGINX Load Balancer instance.
webserver_1 |
webserver_1 | BUILD_DATE:
webserver_1 | BUILD_VCS_REF:
webserver_1 | BUILD_VERSION:
webserver_1 |
webserver_1 | ##
webserver_1 | ## Config
webserver_1 | ##
webserver_1 | 'config' directory found. The following contents are going to be copied to /etc/nginx/conf.d/location:
webserver_1 |
webserver_1 | [WARNING] 092/180413 (9) : We generated two equal cookies for two different servers.
webserver_1 | Please change the secret key for 'api'.
webserver_1 | [... the same two-line warning repeated 44 more times ...]
webserver_1 | [WARNING] 092/180413 (9) : Server api/backend1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 9 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
webserver_1 | [WARNING] 092/180413 (9) : Server api/backend1 was DOWN and now enters maintenance (No IP for server ).
webserver_1 | [WARNING] 092/180413 (9) : Server api/backend2 is going DOWN for maintenance (No IP for server ). 8 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
webserver_1 | [WARNING] 092/180413 (9) : Server api/backend3 is going DOWN for maintenance (No IP for server ). 7 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
webserver_1 | [WARNING] 092/180413 (9) : Server api/backend4 is going DOWN for maintenance (No IP for server ). 6 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
webserver_1 | [WARNING] 092/180413 (9) : Server api/backend5 is going DOWN for maintenance (No IP for server ). 5 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
webserver_1 | [WARNING] 092/180413 (9) : Server api/backend6 is going DOWN for maintenance (No IP for server ). 4 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
webserver_1 | [WARNING] 092/180413 (9) : Server api/backend7 is going DOWN for maintenance (No IP for server ). 3 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
webserver_1 | [WARNING] 092/180413 (9) : Server api/backend8 is going DOWN for maintenance (No IP for server ). 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
webserver_1 | [WARNING] 092/180413 (9) : Server api/backend9 is going DOWN for maintenance (No IP for server ). 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
webserver_1 | /wedeploy-container/config
webserver_1 | |-- common
webserver_1 | | `-- temporary-upload.conf
webserver_1 | |-- dev
webserver_1 | | `-- liferay.conf
webserver_1 | |-- prd
webserver_1 | `-- uat
webserver_1 | `-- liferay.conf
webserver_1 |
webserver_1 |
webserver_1 | ##
webserver_1 | ## Deploy
webserver_1 | ##
webserver_1 | 'deploy' directory found. The following contents are going to be copied to '/var/www/html':
webserver_1 |
webserver_1 | /wedeploy-container/deploy
webserver_1 | |-- common
webserver_1 | | `-- temporary-upload.html
webserver_1 | |-- dev
webserver_1 | |-- prd
webserver_1 | `-- uat
webserver_1 | [tusd] Using '/etc/tusd/hooks' for hooks
webserver_1 | [tusd] Using '/data/uploads' as directory storage.
webserver_1 | [tusd] Using 0.00MB as maximum size.
webserver_1 | [tusd] Using 0.0.0.0:1080 as address to listen.
webserver_1 | [tusd] Using /client-file-upload/ as the base path.
webserver_1 | [tusd] Using /metrics as the metrics path.
webserver_1 | [tusd] Core: ✓ Terminater: ✓ Finisher: ✗ Locker: ✓ GetReader: ✓ Concater: ✓ LengthDeferrer: ✓
webserver_1 |
webserver_1 |
webserver_1 | ##
webserver_1 | ## Script
webserver_1 | ##
webserver_1 | 'script' directory found. The following contents are going to be executed
webserver_1 |
webserver_1 | /wedeploy-container/script
webserver_1 | |-- commonLocal
webserver_1 | |-- dev
webserver_1 | |-- prd
webserver_1 | `-- uat
webserver_1 | Ignoring /tmp/script/*
webserver_1 |
webserver_1 |
webserver_1 |
webserver_1 | 2019-04-03 18:04:14,554 INFO success: haproxy entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
webserver_1 | 2019-04-03 18:04:14,573 INFO success: nginx entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
webserver_1 | 2019-04-03 18:04:14,573 INFO success: tusd entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
webserver_1 | [WARNING] 092/180415 (9) : Server api/backend10 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 1ms. 0 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
webserver_1 | [ALERT] 092/180415 (9) : backend 'api' has no server available!
search_1 | [2019-04-03T18:04:30,094][INFO ][o.e.n.Node ] [9eaa428e25bb] initializing ...
search_1 | /usr/local/bin/entrypoint-run-elasticsearch.sh: line 7: 49 Killed ${ES_DOCKER_ENTRYPOINT} ${ES_EXTRA_ARGS}
dxpcloud_liferaygreg_search_1 exited with code 137
Stopping dxpcloud_liferaygreg_webserver_1 ... done
Stopping dxpcloud_liferaygreg_liferay_1 ... done
Stopping dxpcloud_liferaygreg_database_1 ... done
Aborting on container exit...
:wedeploy:startDxpCloudLocal FAILED
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':wedeploy:startDxpCloudLocal'.
> Process 'command 'docker-compose'' finished with non-zero exit value 137
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output.
BUILD FAILED
Total time: 1 mins 26.866 secs
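Exit value 137 is 128 + 9, i.e. the process died from SIGKILL; together with the `49 Killed` line from the search container above, this points at the Elasticsearch process being killed, most likely by the OOM killer when Docker runs out of memory. A quick way to confirm, as a sketch (container name copied from the log above; whether the stopped container still exists depends on your compose flags):

```shell
# Exit code 137 = 128 + 9 (SIGKILL) -- under Docker this is usually the OOM killer.
test "$((128 + 9))" -eq 137 && echo "exit 137 means SIGKILL"

# Ask Docker whether the search container was OOM-killed
# (skipped when docker is not installed).
if command -v docker >/dev/null 2>&1; then
  docker inspect dxpcloud_liferaygreg_search_1 \
    --format 'OOMKilled={{.State.OOMKilled}} ExitCode={{.State.ExitCode}}'
fi
```

If `OOMKilled=true`, the usual fix is to raise the Docker Desktop memory allocation (the liferay service alone starts with `-Xms4G -Xmx4G` per the log) or lower the services' heap sizes, then re-run `./gradlew startDxpCloudLocal`.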
@MEDIEVALL

blade gw StartDxpCloudLocal
Starting a Gradle Daemon, 1 incompatible and 1 stopped Daemons could not be reused, use --status for details
..:: Liferay Workspace for DXP Cloud generated from archetype 'lcp-dxpcloud-workspace:3.1.0' ::..
:lcp:clean
:lcp:buildDxpCloudServices
Collecting static DXP Cloud files from 'C:\Users\pavil\trainingpositiva\lcp':

* file 'lcp\backup\LCP.json'
* file 'lcp\ci\LCP.json'
* file 'lcp\database\LCP.json'
* file 'lcp\docker-compose.yml'
* file 'lcp\liferay\config\common\com.liferay.portal.search.elasticsearch6.configuration.ElasticsearchConfiguration.config'
* file 'lcp\liferay\config\common\portal-all.properties'
* file 'lcp\liferay\config\dev\portal-env.properties'
* file 'lcp\liferay\config\local\portal-env.properties'
* file 'lcp\liferay\config\prd\portal-env.properties'
* file 'lcp\liferay\config\uat\portal-env.properties'
* file 'lcp\liferay\LCP.json'
* file 'lcp\liferay\license\common\license.xml'
* file 'lcp\search\LCP.json'
* file 'lcp\webserver\config\dev\liferay.conf'
* file 'lcp\webserver\config\uat\liferay.conf'
* file 'lcp\webserver\deploy\common\temporary-upload.html'
* file 'lcp\webserver\deploy\dev\.htpasswd'
* file 'lcp\webserver\deploy\prd\.htpasswd'
* file 'lcp\webserver\deploy\uat\.htpasswd'
* file 'lcp\webserver\LCP.json'
:lcp:createMinimalBundleSrc
:lcp:zipMinimalBundleToBundlesCache UP-TO-DATE
:downloadBundle SKIPPED
:modules:positiva-GAMADEAD-web:compileJava UP-TO-DATE
:modules:positiva-GAMADEAD-web:buildCSS UP-TO-DATE
:modules:positiva-GAMADEAD-web:processResources UP-TO-DATE
:modules:positiva-GAMADEAD-web:transpileJS SKIPPED
:modules:positiva-GAMADEAD-web:configJSModules SKIPPED
:modules:positiva-GAMADEAD-web:replaceSoyTranslation NO-SOURCE
:modules:positiva-GAMADEAD-web:wrapSoyAlloyTemplate SKIPPED
:modules:positiva-GAMADEAD-web:classes UP-TO-DATE
:modules:positiva-GAMADEAD-web:jar UP-TO-DATE
:modules:positiva-ameloAria-web:compileJava UP-TO-DATE
:modules:positiva-ameloAria-web:buildCSS UP-TO-DATE
:modules:positiva-ameloAria-web:processResources UP-TO-DATE
:modules:positiva-ameloAria-web:transpileJS SKIPPED
:modules:positiva-ameloAria-web:configJSModules SKIPPED
:modules:positiva-ameloAria-web:replaceSoyTranslation NO-SOURCE
:modules:positiva-ameloAria-web:wrapSoyAlloyTemplate SKIPPED
:modules:positiva-ameloAria-web:classes UP-TO-DATE
:modules:positiva-ameloAria-web:jar UP-TO-DATE
:modules:positiva-briggidtgonzalez-web:compileJava UP-TO-DATE
:modules:positiva-briggidtgonzalez-web:buildCSS UP-TO-DATE
:modules:positiva-briggidtgonzalez-web:processResources UP-TO-DATE
:modules:positiva-briggidtgonzalez-web:transpileJS SKIPPED
:modules:positiva-briggidtgonzalez-web:configJSModules SKIPPED
:modules:positiva-briggidtgonzalez-web:replaceSoyTranslation NO-SOURCE
:modules:positiva-briggidtgonzalez-web:wrapSoyAlloyTemplate SKIPPED
:modules:positiva-briggidtgonzalez-web:classes UP-TO-DATE
:modules:positiva-briggidtgonzalez-web:jar UP-TO-DATE
:modules:positiva-dpenalosa-web:compileJava UP-TO-DATE
:modules:positiva-dpenalosa-web:buildCSS UP-TO-DATE
:modules:positiva-dpenalosa-web:processResources UP-TO-DATE
:modules:positiva-dpenalosa-web:transpileJS SKIPPED
:modules:positiva-dpenalosa-web:configJSModules SKIPPED
:modules:positiva-dpenalosa-web:replaceSoyTranslation NO-SOURCE
:modules:positiva-dpenalosa-web:wrapSoyAlloyTemplate SKIPPED
:modules:positiva-dpenalosa-web:classes UP-TO-DATE
:modules:positiva-dpenalosa-web:jar UP-TO-DATE
:modules:positiva-dvalencia-web:compileJava UP-TO-DATE
:modules:positiva-dvalencia-web:buildCSS NO-SOURCE
:modules:positiva-dvalencia-web:processResources UP-TO-DATE
:modules:positiva-dvalencia-web:transpileJS SKIPPED
:modules:positiva-dvalencia-web:configJSModules SKIPPED
:modules:positiva-dvalencia-web:replaceSoyTranslation NO-SOURCE
:modules:positiva-dvalencia-web:wrapSoyAlloyTemplate SKIPPED
:modules:positiva-dvalencia-web:classes UP-TO-DATE
:modules:positiva-dvalencia-web:jar UP-TO-DATE
:modules:positiva-example-web:compileJava UP-TO-DATE
:modules:positiva-example-web:buildCSS UP-TO-DATE
:modules:positiva-example-web:processResources UP-TO-DATE
:modules:positiva-example-web:transpileJS SKIPPED
:modules:positiva-example-web:configJSModules SKIPPED
:modules:positiva-example-web:replaceSoyTranslation NO-SOURCE
:modules:positiva-example-web:wrapSoyAlloyTemplate SKIPPED
:modules:positiva-example-web:classes UP-TO-DATE
:modules:positiva-example-web:jar UP-TO-DATE
:themes:positiva-theme:createLiferayThemeJson
:downloadNode SKIPPED
:themes:positiva-theme:downloadNode SKIPPED
:themes:positiva-theme:npmInstall UP-TO-DATE
:themes:positiva-theme:gulpBuild
[08:15:45] Using gulpfile ~\trainingpositiva\themes\positiva-theme\gulpfile.js
[08:15:45] Starting 'build'...
[08:15:45] Starting 'build:clean'...
[08:15:46] Finished 'build:clean' after 1.63 s
[08:15:46] Starting 'build:base'...
[08:15:54] Finished 'build:base' after 7.36 s
[08:15:54] Starting 'build:src'...
[08:15:54] Finished 'build:src' after 41 ms
[08:15:54] Starting 'build:web-inf'...
[08:15:54] Finished 'build:web-inf' after 3.17 ms
[08:15:54] Starting 'build:liferay-look-and-feel'...
[08:15:54] Finished 'build:liferay-look-and-feel' after 140 ms
[08:15:54] Starting 'build:hook'...
[08:15:54] Finished 'build:hook' after 29 ms
[08:15:54] Starting 'build:themelets'...
[08:15:54] Starting 'build:themelet-src'...
[08:15:54] Finished 'build:themelet-src' after 1.07 ms
[08:15:54] Starting 'build:themelet-css-inject'...
[08:15:55] Starting 'build:themelet-js-inject'...
[08:15:55] gulp-inject Nothing to inject into _custom.scss.
    [08:15:55] gulp-inject Nothing to inject into portal_normal.ftl.
    [08:15:55] Finished 'build:themelet-css-inject' after 1.03 s
    [08:15:55] Finished 'build:themelet-js-inject' after 57 ms
    [08:15:55] Finished 'build:themelets' after 1.06 s
    [08:15:55] Starting 'build:rename-css-dir'...
    [08:15:55] Finished 'build:rename-css-dir' after 28 ms
    [08:15:55] Starting 'build:compile-css'...
    [08:15:55] Starting 'build:compile-lib-sass'...
    [08:16:01] Finished 'build:compile-lib-sass' after 5.96 s
    [08:16:01] Finished 'build:compile-css' after 5.96 s
    [08:16:01] Starting 'build:fix-url-functions'...
    [08:16:01] Finished 'build:fix-url-functions' after 85 ms
    [08:16:01] Starting 'build:move-compiled-css'...
    [08:16:02] Finished 'build:move-compiled-css' after 1.15 s
    [08:16:02] Starting 'build:remove-old-css-dir'...
    [08:16:02] Finished 'build:remove-old-css-dir' after 73 ms
    [08:16:02] Starting 'build:fix-at-directives'...
    [08:16:02] Finished 'build:fix-at-directives' after 11 ms
    [08:16:02] Starting 'build:r2'...
    [08:16:03] Finished 'build:r2' after 487 ms
    [08:16:03] Starting 'build:copy:fontAwesome'...
    [08:16:03] Finished 'build:copy:fontAwesome' after 144 μs
    [08:16:03] Starting 'build:war'...
    [08:16:03] Starting 'plugin:version'...
    [08:16:03] Finished 'plugin:version' after 1.61 ms
    [08:16:03] Starting 'plugin:war'...
    [08:16:04] Finished 'plugin:war' after 1.17 s
    [08:16:04] Finished 'build:war' after 1.18 s
    [08:16:04] Finished 'build' after 19 s
    :themes:positiva-theme:assemble
    :distBundle
    :lcp:copyBuiltWorkspaceModules
    Copying custom modules, themes and wars built by Liferay Workspace 'liferay' service:
  • file 'build\dist\osgi\modules\co.gov.positiva.ameloAria.web-1.0.0.jar'
  • file 'build\dist\osgi\modules\co.gov.positiva.briggidtgonzalez.web-1.0.0.jar'
  • file 'build\dist\osgi\modules\co.gov.positiva.dpenalosa.web-1.0.0.jar'
  • file 'build\dist\osgi\modules\co.gov.positiva.dvalencia.web-1.0.0.jar'
  • file 'build\dist\osgi\modules\co.gov.positiva.example.web-1.0.0.jar'
  • file 'build\dist\osgi\modules\co.gov.positiva.GAMADEAD.web-1.0.0.jar'
  • file 'build\dist\osgi\war\positiva-theme.war'
    :lcp:distLiferayCloud
    Services for DXP Cloud were built and are ready to be deployed from 'build\lcp':
    :lcp:startDxpCloudLocal
    Starting Liferay stack for DXP Cloud locally, using command:
    cmd /c docker-compose --file C:\Users\pavil\trainingpositiva\build\lcp/docker-compose.yml --project-name dxpcloud_trainingpositiva up --abort-on-container-exit --renew-anon-volumes --build --force-recreate

Recreating dxpcloudtrainingpositiva_database_1 ... done
Recreating dxpcloudtrainingpositiva_search_1 ... done
Recreating dxpcloudtrainingpositiva_liferay_1 ... done
Recreating dxpcloudtrainingpositiva_webserver_1 ... done
Attaching to dxpcloudtrainingpositiva_database_1, dxpcloudtrainingpositiva_search_1, dxpcloudtrainingpositiva_liferay_1, dxpcloudtrainingpositiva_webserver_1
database_1 | 2019-09-05T13:16:19.524946Z 0 [Warning] TIMESTAMP with implicit DEFAULT value is deprecated. Please use --explicit_defaults_for_timestamp server option (see documentation for more details).
database_1 | 2019-09-05T13:16:19.550596Z 0 [Note] mysqld (mysqld 5.7.27) starting as process 1 ...
database_1 | 2019-09-05T13:16:19.621528Z 0 [Note] InnoDB: PUNCH HOLE support available
search_1 | Starting Elasticsearch instance.
database_1 | 2019-09-05T13:16:19.621800Z 0 [Note] InnoDB: Mutexes and rw_locks use GCC atomic builtins
liferay_1 | Starting Liferay DXP instance.
search_1 |
database_1 | 2019-09-05T13:16:19.622012Z 0 [Note] InnoDB: Uses event mutexes
liferay_1 |
search_1 | BUILD_DATE: 2019-07-23T06:06:48Z
database_1 | 2019-09-05T13:16:19.622263Z 0 [Note] InnoDB: GCC builtin __atomic_thread_fence() is used for memory barrier
liferay_1 | LIFERAY_HOME: /opt/liferay
search_1 | BUILD_VCS_REF: 84556edb66347b6d7375c07e4928103b0339a499
database_1 | 2019-09-05T13:16:19.622499Z 0 [Note] InnoDB: Compressed tables use zlib 1.2.11
database_1 | 2019-09-05T13:16:19.622597Z 0 [Note] InnoDB: Using Linux native AIO
liferay_1 | BUILD_DATE: 2019-08-23T16:32:23Z
search_1 | BUILD_VERSION: 6.1.4-3.0.3
liferay_1 | BUILD_VCS_REF: aa5ac14db5e5800810b77bdb15a6070d70c7c395
search_1 | CLUSTER_NAME: liferay_cluster
search_1 | NODE_MASTER: true
database_1 | 2019-09-05T13:16:19.670007Z 0 [Note] InnoDB: Number of pools: 1
search_1 | HOSTNAME: 36cc37f26d04
search_1 | NODE_DATA: true
search_1 | NODE_INGEST: true
search_1 | PING_UNICAST_HOST: search
search_1 | MAX_LOCAL_STORAGE_NODES: 1
search_1 | NETWORK_HOST: site
search_1 | MEMORY_LOCK: false
search_1 | NUMBER_OF_MASTERS: 1
search_1 | ENABLE_XPACK_SECURITY: false
search_1 |
search_1 | ##
search_1 | ## Config
liferay_1 | BUILD_VERSION: 7.2.10-ga1-3.0.10
liferay_1 |
liferay_1 | ##
liferay_1 | ## Config
liferay_1 | ##
liferay_1 | 'config' directory found. The following contents are going to be copied to /opt/liferay and/or /opt/liferay/osgi/configs:
liferay_1 |
liferay_1 | /lcp-container/config
liferay_1 | ├── common
liferay_1 | │   ├── com.liferay.portal.search.elasticsearch6.configuration.ElasticsearchConfiguration.config
liferay_1 | │   └── portal-all.properties
liferay_1 | ├── dev
liferay_1 | │   └── portal-env.properties
liferay_1 | ├── local
liferay_1 | │   └── portal-env.properties
liferay_1 | ├── prd
liferay_1 | │   └── portal-env.properties
liferay_1 | └── uat
liferay_1 |     └── portal-env.properties
liferay_1 |
liferay_1 |
liferay_1 | ##
liferay_1 | ## Deploy
liferay_1 | ##
liferay_1 | 'deploy' directory found. The content below will be copied according to the following:
liferay_1 | .jar files will be copied to /opt/liferay/osgi/modules
liferay_1 | .lpkg files will be copied to /opt/liferay/osgi/marketplace
liferay_1 | .war files will be copied to /opt/liferay/osgi/war
liferay_1 |
liferay_1 | /lcp-container/deploy
liferay_1 | ├── common
liferay_1 | │   ├── co.gov.positiva.ameloAria.web-1.0.0.jar
liferay_1 | │   ├── co.gov.positiva.briggidtgonzalez.web-1.0.0.jar
liferay_1 | │   ├── co.gov.positiva.dpenalosa.web-1.0.0.jar
liferay_1 | │   ├── co.gov.positiva.dvalencia.web-1.0.0.jar
liferay_1 | │   ├── co.gov.positiva.example.web-1.0.0.jar
liferay_1 | │   ├── co.gov.positiva.GAMADEAD.web-1.0.0.jar
liferay_1 | │   └── positiva-theme.war
liferay_1 | ├── dev
liferay_1 | ├── prd
liferay_1 | └── uat
liferay_1 |
liferay_1 |
search_1 | ##
database_1 | 2019-09-05T13:16:19.788665Z 0 [Note] InnoDB: Using CPU crc32 instructions
database_1 | 2019-09-05T13:16:19.852462Z 0 [Note] InnoDB: Initializing buffer pool, total size = 128M, instances = 1, chunk size = 128M
database_1 | 2019-09-05T13:16:19.982039Z 0 [Note] InnoDB: Completed initialization of buffer pool
database_1 | 2019-09-05T13:16:20.056065Z 0 [Note] InnoDB: If the mysqld execution user is authorized, page cleaner thread priority can be changed. See the man page of setpriority().
database_1 | 2019-09-05T13:16:20.307179Z 0 [Note] InnoDB: Highest supported file format is Barracuda.
database_1 | 2019-09-05T13:16:21.748738Z 0 [Note] InnoDB: Creating shared tablespace for temporary tables
database_1 | 2019-09-05T13:16:21.748921Z 0 [Note] InnoDB: Setting file './ibtmp1' size to 12 MB. Physically writing the file full; Please wait ...
database_1 | 2019-09-05T13:16:22.412285Z 0 [Note] InnoDB: File './ibtmp1' size is now 12 MB.
database_1 | 2019-09-05T13:16:22.413402Z 0 [Note] InnoDB: 96 redo rollback segment(s) found. 96 redo rollback segment(s) are active.
database_1 | 2019-09-05T13:16:22.413419Z 0 [Note] InnoDB: 32 non-redo rollback segment(s) are active.
database_1 | 2019-09-05T13:16:22.414630Z 0 [Note] InnoDB: 5.7.27 started; log sequence number 1328874
database_1 | 2019-09-05T13:16:22.416115Z 0 [Note] InnoDB: Loading buffer pool(s) from /var/lib/mysql/ib_buffer_pool
database_1 | 2019-09-05T13:16:22.422402Z 0 [Note] Plugin 'FEDERATED' is disabled.
database_1 | 2019-09-05T13:16:23.306345Z 0 [Warning] Failed to set up SSL because of the following SSL library error: SSL context is not usable without certificate and private key
database_1 | 2019-09-05T13:16:23.307023Z 0 [Note] Server hostname (bind-address): ''; port: 3306
database_1 | 2019-09-05T13:16:23.325405Z 0 [Note] IPv6 is available.
database_1 | 2019-09-05T13:16:23.359236Z 0 [Note] - '::' resolves to '::';
database_1 | 2019-09-05T13:16:23.359816Z 0 [Note] Server socket created on IP: '::'.
database_1 | 2019-09-05T13:16:23.409686Z 0 [Note] InnoDB: Buffer pool(s) load completed at 190905 13:16:23
liferay_1 | ##
liferay_1 | ## License
liferay_1 | ##
liferay_1 | 'license' directory found. The following contents are going to be copied to /opt/liferay/deploy and/or /opt/liferay/data:
liferay_1 |
liferay_1 | /lcp-container/license
liferay_1 | ├── common
liferay_1 | │   └── license.xml
liferay_1 | ├── dev
liferay_1 | ├── prd
liferay_1 | └── uat
liferay_1 |
liferay_1 |
liferay_1 | ##
liferay_1 | ## Hot fix
liferay_1 | ##
liferay_1 | No 'hotfix' directory found. If you wish to apply hotfixes to Liferay make sure
liferay_1 | to drop your hotfixes (zip) files in the 'hotfix' directory.
liferay_1 |
liferay_1 |
liferay_1 | ##
liferay_1 | ## Cluster
liferay_1 | ##
liferay_1 | Cluster not enabled. If you want to enable cluster please set the environment variable LCP_PROJECT_LIFERAY_CLUSTER_ENABLED to true
liferay_1 |
liferay_1 |
liferay_1 | ##
liferay_1 | ## Monitor
liferay_1 | ##
liferay_1 | Monitor is not enabled. If you want to enable please set the environment variables
liferay_1 | LCP_PROJECT_MONITOR_DYNATRACE_TENANT and LCP_PROJECT_MONITOR_DYNATRACE_TOKEN.
liferay_1 |
liferay_1 |
liferay_1 | ##
liferay_1 | ## Script
liferay_1 | ##
liferay_1 | 'script' directory found. The following contents are going to be executed
liferay_1 |
liferay_1 | /lcp-container/script
liferay_1 | ├── common
liferay_1 | ├── dev
liferay_1 | ├── prd
liferay_1 | └── uat
liferay_1 | Ignoring /tmp/script/

liferay_1 |
liferay_1 |
search_1 | 'config' directory found. The following contents are going to be copied to /usr/share/elasticsearch/config :
search_1 |
search_1 | /lcp-container/config
search_1 | `-- common
search_1 |
search_1 |
search_1 | ##
search_1 | ## License
search_1 | ##
search_1 | 'license' directory found. The following licenses will be put into elasticsearch:
search_1 |
search_1 | /lcp-container/license
search_1 | `-- common
search_1 | ls: cannot access /tmp/license/*.json: No such file or directory
search_1 |
search_1 |
search_1 | ##
search_1 | ## Script
search_1 | ##
search_1 | 'script' directory found. The following contents are going to be executed
search_1 |
search_1 | /lcp-container/script
search_1 | `-- common
search_1 | Ignoring /tmp/script/*
search_1 |
search_1 |
search_1 |
search_1 | ##
search_1 | ## deploy
search_1 | ##
search_1 | No 'deploy' directory found. If you wish to install additional Elasticsearch
search_1 | plugins, make sure to drop your *.plugins files into 'deploy' directory.
search_1 |
database_1 | 2019-09-05T13:16:23.501619Z 0 [Warning] Insecure configuration for --pid-file: Location '/var/run/mysqld' in the path is accessible to all OS users. Consider choosing a different directory.
database_1 | 2019-09-05T13:16:25.437263Z 0 [Note] Event Scheduler: Loaded 0 events
webserver_1 | 2019-09-05 13:16:28,576 CRIT Supervisor running as root (no user in config file)
webserver_1 | 2019-09-05 13:16:28,671 CRIT Server 'unix_http_server' running without any HTTP authentication checking
webserver_1 | 2019-09-05 13:16:28,703 INFO supervisord started with pid 13
webserver_1 | 2019-09-05 13:16:29,775 INFO spawned: 'haproxy' with pid 16
liferay_1 |
liferay_1 | OpenJDK 64-Bit Server VM warning: INFO: os::commit_memory(0x00000000b0000000, 1342177280, 0) failed; error='Cannot allocate memory' (errno=12)
liferay_1 | #
liferay_1 | # There is insufficient memory for the Java Runtime Environment to continue.
liferay_1 | # Native memory allocation (mmap) failed to map 1342177280 bytes for committing reserved memory.
liferay_1 | # An error report file with more information is saved as:
liferay_1 | # /opt/liferay/hs_err_pid1.log
webserver_1 | 2019-09-05 13:16:29,829 INFO spawned: 'nginx' with pid 17
webserver_1 | Starting Liferay's NGINX Load Balancer instance.
webserver_1 |
webserver_1 | BUILD_DATE:
webserver_1 | BUILD_VCS_REF:
webserver_1 | BUILD_VERSION:
webserver_1 |
webserver_1 | ##
webserver_1 | ## Config
webserver_1 | ##
webserver_1 | 'config' directory found. The following contents are going to be copied to /etc/nginx/conf.d/location:
webserver_1 |
webserver_1 | /lcp-container/config
webserver_1 | |-- common
webserver_1 | |-- dev
webserver_1 | |   `-- liferay.conf
webserver_1 | |-- prd
webserver_1 | `-- uat
webserver_1 |     `-- liferay.conf
webserver_1 | [WARNING] 247/131630 (16) : Server api/backend1 is going DOWN for maintenance. 9 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
webserver_1 | [WARNING] 247/131630 (16) : Server api/backend2 is going DOWN for maintenance. 8 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
webserver_1 | [WARNING] 247/131630 (16) : Server api/backend3 is going DOWN for maintenance. 7 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
webserver_1 | [WARNING] 247/131630 (16) : Server api/backend4 is going DOWN for maintenance. 6 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
webserver_1 | [WARNING] 247/131630 (16) : Server api/backend5 is going DOWN for maintenance. 5 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
webserver_1 | [WARNING] 247/131630 (16) : Server api/backend6 is going DOWN for maintenance. 4 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
webserver_1 | [WARNING] 247/131630 (16) : Server api/backend7 is going DOWN for maintenance. 3 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
webserver_1 | [WARNING] 247/131630 (16) : Server api/backend8 is going DOWN for maintenance. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
webserver_1 | [WARNING] 247/131630 (16) : Server api/backend9 is going DOWN for maintenance. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
webserver_1 | [WARNING] 247/131630 (16) : Server api/backend10 is going DOWN for maintenance. 0 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
webserver_1 | [ALERT] 247/131630 (16) : backend 'api' has no server available!
webserver_1 | [WARNING] 247/131630 (16) : api/backend1 changed its IP from to 172.18.0.4 by docker/default.
webserver_1 | [WARNING] 247/131630 (16) : Server api/backend1 administratively READY thanks to valid DNS answer.
webserver_1 | [WARNING] 247/131630 (16) : Server api/backend1 ('liferay') is UP/READY (resolves again).
webserver_1 |
webserver_1 |
webserver_1 | 2019-09-05 13:16:33,146 INFO success: haproxy entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
webserver_1 | 2019-09-05 13:16:33,147 INFO success: nginx entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
webserver_1 | [WARNING] 247/131632 (16) : Server api/backend1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 0 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
webserver_1 | [ALERT] 247/131632 (16) : backend 'api' has no server available!
webserver_1 | ##
webserver_1 | ## Deploy
webserver_1 | ##
webserver_1 | 'deploy' directory found. The following contents are going to be copied to '/var/www/html':
webserver_1 |
webserver_1 | /lcp-container/deploy
webserver_1 | |-- common
webserver_1 | |   `-- temporary-upload.html
webserver_1 | |-- dev
webserver_1 | |-- prd
webserver_1 | `-- uat
webserver_1 |
webserver_1 |
webserver_1 | ##
webserver_1 | ## Script
webserver_1 | ##
webserver_1 | 'script' directory found. The following contents are going to be executed
webserver_1 |
webserver_1 | /lcp-container/script
webserver_1 | |-- common
webserver_1 | |-- dev
webserver_1 | |-- prd
webserver_1 | `-- uat
dxpcloudtrainingpositiva_liferay_1 exited with code 1
Stopping dxpcloudtrainingpositiva_webserver_1 ... done
Stopping dxpcloudtrainingpositiva_search_1 ... done
Stopping dxpcloudtrainingpositiva_database_1 ... done
Aborting on container exit...
:lcp:startDxpCloudLocal FAILED

FAILURE: Build failed with an exception.

  • What went wrong:
    Execution failed for task ':lcp:startDxpCloudLocal'.

Process 'command 'cmd'' finished with non-zero exit value 1

  • Try:
    Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output.

BUILD FAILED

Total time: 1 mins 40.583 secs
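
The root cause appears in the interleaved container output above: the liferay container's JVM could not commit ~1.3 GB (1342177280 bytes) of reserved memory ('Cannot allocate memory', errno=12) and exited with code 1, which took down the whole compose stack because of `--abort-on-container-exit`. Typical remedies are to give the Docker VM more memory, or to shrink the JVM heap below what the host can actually provide. The fragment below is only an illustrative sketch: the `liferay` service name matches the compose file used above, but the `JAVA_OPTS` variable name is an assumption — check the image's entrypoint for the variable it actually honors.

```yaml
# docker-compose.override.yml (hypothetical sketch, not part of the original build)
# Placed next to build\lcp\docker-compose.yml, docker-compose merges it
# into the base file automatically.
version: "3"
services:
  liferay:
    environment:
      # Keep initial and max heap below the memory the Docker VM can commit.
      # NOTE: "JAVA_OPTS" is an assumption; the DXP Cloud image may read a
      # differently named variable.
      JAVA_OPTS: "-Xms512m -Xmx1g"
```

Alternatively, on Docker Desktop the VM's memory allocation can be raised under Settings → Resources, which avoids touching the compose files.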
