@amitkumarj441
Last active June 22, 2017 13:26
Pod description
[root@viaq openshift-ansible]# oc describe pod logging-kibana-1-xk06c
Name: logging-kibana-1-xk06c
Namespace: logging
Security Policy: restricted
Node: viaq.logging.test/172.16.93.5
Start Time: Thu, 22 Jun 2017 13:08:55 +0000
Labels: component=kibana
deployment=logging-kibana-1
deploymentconfig=logging-kibana
logging-infra=kibana
provider=openshift
Status: Terminating (expires Thu, 22 Jun 2017 13:19:26 +0000)
Termination Grace Period: 30s
IP: 10.128.0.10
Controllers: ReplicationController/logging-kibana-1
Containers:
kibana:
Container ID:
Image: docker.io/openshift/origin-logging-kibana:v1.5.1
Image ID:
Port:
Limits:
memory: 736Mi
Requests:
memory: 736Mi
State: Waiting
Reason: ErrImagePull
Ready: False
Restart Count: 0
Volume Mounts:
/etc/kibana/keys from kibana (ro)
/var/run/secrets/kubernetes.io/serviceaccount from aggregated-logging-kibana-token-xhjtf (ro)
Environment Variables:
ES_HOST: logging-es
ES_PORT: 9200
KIBANA_MEMORY_LIMIT: 771751936 (limits.memory)
kibana-proxy:
Container ID:
Image: docker.io/openshift/origin-logging-auth-proxy:v1.5.1
Image ID:
Port: 3000/TCP
Limits:
memory: 96Mi
Requests:
memory: 96Mi
State: Waiting
Reason: ErrImagePull
Ready: False
Restart Count: 0
Volume Mounts:
/secret from kibana-proxy (ro)
/var/run/secrets/kubernetes.io/serviceaccount from aggregated-logging-kibana-token-xhjtf (ro)
Environment Variables:
OAP_BACKEND_URL: http://localhost:5601
OAP_AUTH_MODE: oauth2
OAP_TRANSFORM: user_header,token_header
OAP_OAUTH_ID: kibana-proxy
OAP_MASTER_URL: https://kubernetes.default.svc.cluster.local
OAP_PUBLIC_MASTER_URL: https://viaq.logging.test:8443
OAP_LOGOUT_REDIRECT: https://viaq.logging.test:8443/console/logout
OAP_MASTER_CA_FILE: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt
OAP_DEBUG: False
OAP_OAUTH_SECRET_FILE: /secret/oauth-secret
OAP_SERVER_CERT_FILE: /secret/server-cert
OAP_SERVER_KEY_FILE: /secret/server-key
OAP_SERVER_TLS_FILE: /secret/server-tls.json
OAP_SESSION_SECRET_FILE: /secret/session-secret
OCP_AUTH_PROXY_MEMORY_LIMIT: 100663296 (limits.memory)
Conditions:
Type Status
Initialized True
Ready False
PodScheduled True
Volumes:
kibana:
Type: Secret (a volume populated by a Secret)
SecretName: logging-kibana
kibana-proxy:
Type: Secret (a volume populated by a Secret)
SecretName: logging-kibana-proxy
aggregated-logging-kibana-token-xhjtf:
Type: Secret (a volume populated by a Secret)
SecretName: aggregated-logging-kibana-token-xhjtf
QoS Class: Burstable
Tolerations: <none>
Events:
FirstSeen LastSeen Count From SubObjectPath Type Reason Message
--------- -------- ----- ---- ------------- -------- ------ -------
14m 14m 1 {default-scheduler } Normal Scheduled Successfully assigned logging-kibana-1-xk06c to viaq.logging.test
13m 13m 1 {kubelet viaq.logging.test} spec.containers{kibana-proxy} Warning Failed Failed to pull image "docker.io/openshift/origin-logging-auth-proxy:v1.5.1": image pull failed for docker.io/openshift/origin-logging-auth-proxy:v1.5.1, this may be because there are no credentials on this request. details: (Network timed out while trying to connect to https://index.docker.io/v1/repositories/openshift/origin-logging-auth-proxy/images. You may want to check your internet connection or if you are behind a proxy.)
13m 13m 1 {kubelet viaq.logging.test} Warning FailedSync Error syncing pod, skipping: [failed to "StartContainer" for "kibana-proxy" with ErrImagePull: "image pull failed for docker.io/openshift/origin-logging-auth-proxy:v1.5.1, this may be because there are no credentials on this request. details: (Network timed out while trying to connect to https://index.docker.io/v1/repositories/openshift/origin-logging-auth-proxy/images. You may want to check your internet connection or if you are behind a proxy.)"
, failed to "StartContainer" for "kibana" with ErrImagePull: "image pull failed for docker.io/openshift/origin-logging-kibana:v1.5.1, this may be because there are no credentials on this request. details: (Tag v1.5.1 not found in repository docker.io/openshift/origin-logging-kibana)"
]
13m 13m 1 {kubelet viaq.logging.test} spec.containers{kibana} Normal BackOff Back-off pulling image "docker.io/openshift/origin-logging-kibana:v1.5.1"
12m 12m 1 {kubelet viaq.logging.test} Warning FailedSync Error syncing pod, skipping: [failed to "StartContainer" for "kibana" with ImagePullBackOff: "Back-off pulling image \"docker.io/openshift/origin-logging-kibana:v1.5.1\""
, failed to "StartContainer" for "kibana-proxy" with ErrImagePull: "image pull failed for docker.io/openshift/origin-logging-auth-proxy:v1.5.1, this may be because there are no credentials on this request. details: (net/http: request canceled)"
]
12m 9m 2 {kubelet viaq.logging.test} spec.containers{kibana-proxy} Warning Failed Failed to pull image "docker.io/openshift/origin-logging-auth-proxy:v1.5.1": image pull failed for docker.io/openshift/origin-logging-auth-proxy:v1.5.1, this may be because there are no credentials on this request. details: (net/http: request canceled)
13m 9m 3 {kubelet viaq.logging.test} spec.containers{kibana} Normal Pulling pulling image "docker.io/openshift/origin-logging-kibana:v1.5.1"
9m 9m 1 {kubelet viaq.logging.test} Warning FailedSync Error syncing pod, skipping: [failed to "StartContainer" for "kibana" with ErrImagePull: "image pull failed for docker.io/openshift/origin-logging-kibana:v1.5.1, this may be because there are no credentials on this request. details: (Tag v1.5.1 not found in repository docker.io/openshift/origin-logging-kibana)"
, failed to "StartContainer" for "kibana-proxy" with ErrImagePull: "image pull failed for docker.io/openshift/origin-logging-auth-proxy:v1.5.1, this may be because there are no credentials on this request. details: (net/http: request canceled)"
]
13m 8m 3 {kubelet viaq.logging.test} spec.containers{kibana} Warning Failed Failed to pull image "docker.io/openshift/origin-logging-kibana:v1.5.1": image pull failed for docker.io/openshift/origin-logging-kibana:v1.5.1, this may be because there are no credentials on this request. details: (Tag v1.5.1 not found in repository docker.io/openshift/origin-logging-kibana)
14m 8m 4 {kubelet viaq.logging.test} spec.containers{kibana-proxy} Normal Pulling pulling image "docker.io/openshift/origin-logging-auth-proxy:v1.5.1"
[root@viaq openshift-ansible]#
NAME READY STATUS RESTARTS AGE
logging-curator-1-hklth 0/1 ImagePullBackOff 0 9m
logging-es-7vo926zw-1-2txlq 0/1 ImagePullBackOff 0 9m
logging-es-7vo926zw-1-deploy 1/1 Running 0 13m
logging-fluentd-024kj 1/1 Running 0 13m
logging-kibana-1-deploy 1/1 Running 0 13m
logging-kibana-1-xk06c 0/2 ErrImagePull 0 9m
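The listing above mixes healthy pods with ones stuck on image pulls. A small filter like the following (a sketch, assuming the default `oc get pods` column layout of NAME READY STATUS RESTARTS AGE) narrows the output to just the pods failing on pulls:

```shell
# Print only pods whose STATUS column shows an image-pull problem.
# NR > 1 skips the header row; $3 is the STATUS column.
oc get pods | awk 'NR > 1 && $3 ~ /ErrImagePull|ImagePullBackOff/ { print $1, $3 }'
```

Against the listing above this would report `logging-curator-1-hklth`, `logging-es-7vo926zw-1-2txlq`, and `logging-kibana-1-xk06c`, leaving the Running deployer and fluentd pods out.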
[root@viaq openshift-ansible]# oc describe pod
pod poddisruptionbudget
[root@viaq openshift-ansible]# oc describe pod logging-curator-1-hklth
Name: logging-curator-1-hklth
Namespace: logging
Security Policy: restricted
Node: viaq.logging.test/172.16.93.5
Start Time: Thu, 22 Jun 2017 13:08:55 +0000
Labels: component=curator
deployment=logging-curator-1
deploymentconfig=logging-curator
logging-infra=curator
provider=openshift
Status: Running
IP: 10.128.0.11
Controllers: ReplicationController/logging-curator-1
Containers:
curator:
Container ID: docker://8e860285e1cbd3647ed5fdd450549d5bb16d3212db43f33f350a8fd664899438
Image: docker.io/openshift/origin-logging-curator:v1.5.1
Image ID: docker-pullable://docker.io/openshift/origin-logging-curator@sha256:72f1e279da63531941978d98f1cea7cbce6be4a935ce986a229f436ffa03697d
Port:
Limits:
cpu: 100m
Requests:
cpu: 100m
State: Waiting
Reason: ImagePullBackOff
Last State: Terminated
Reason: Error
Exit Code: 255
Started: Thu, 22 Jun 2017 13:14:35 +0000
Finished: Thu, 22 Jun 2017 13:17:34 +0000
Ready: False
Restart Count: 0
Volume Mounts:
/etc/curator/keys from certs (ro)
/etc/curator/settings from config (ro)
/var/run/secrets/kubernetes.io/serviceaccount from aggregated-logging-curator-token-5g95b (ro)
Environment Variables:
K8S_HOST_URL: https://kubernetes.default.svc.cluster.local
ES_HOST: logging-es
ES_PORT: 9200
ES_CLIENT_CERT: /etc/curator/keys/cert
ES_CLIENT_KEY: /etc/curator/keys/key
ES_CA: /etc/curator/keys/ca
CURATOR_DEFAULT_DAYS: 30
CURATOR_RUN_HOUR: 0
CURATOR_RUN_MINUTE: 0
CURATOR_RUN_TIMEZONE: UTC
CURATOR_SCRIPT_LOG_LEVEL: INFO
CURATOR_LOG_LEVEL: ERROR
Conditions:
Type Status
Initialized True
Ready False
PodScheduled True
Volumes:
certs:
Type: Secret (a volume populated by a Secret)
SecretName: logging-curator
config:
Type: ConfigMap (a volume populated by a ConfigMap)
Name: logging-curator
aggregated-logging-curator-token-5g95b:
Type: Secret (a volume populated by a Secret)
SecretName: aggregated-logging-curator-token-5g95b
QoS Class: Burstable
Tolerations: <none>
Events:
FirstSeen LastSeen Count From SubObjectPath Type Reason Message
--------- -------- ----- ---- ------------- -------- ------ -------
10m 10m 1 {default-scheduler } Normal Scheduled Successfully assigned logging-curator-1-hklth to viaq.logging.test
8m 8m 1 {kubelet viaq.logging.test} Warning FailedSync Error syncing pod, skipping: failed to "StartContainer" for "curator" with ErrImagePull: "image pull failed for docker.io/openshift/origin-logging-curator:v1.5.1, this may be because there are no credentials on this request. details: (net/http: request canceled)"
8m 8m 1 {kubelet viaq.logging.test} spec.containers{curator} Warning Failed Failed to pull image "docker.io/openshift/origin-logging-curator:v1.5.1": image pull failed for docker.io/openshift/origin-logging-curator:v1.5.1, this may be because there are no credentials on this request. details: (net/http: request canceled)
4m 4m 1 {kubelet viaq.logging.test} spec.containers{curator} Normal Pulled Successfully pulled image "docker.io/openshift/origin-logging-curator:v1.5.1"
4m 4m 1 {kubelet viaq.logging.test} spec.containers{curator} Normal Created Created container with docker id 8e860285e1cb; Security:[seccomp=unconfined]
4m 4m 1 {kubelet viaq.logging.test} spec.containers{curator} Normal Started Started container with docker id 8e860285e1cb
9m 1m 2 {kubelet viaq.logging.test} spec.containers{curator} Warning Failed Failed to pull image "docker.io/openshift/origin-logging-curator:v1.5.1": image pull failed for docker.io/openshift/origin-logging-curator:v1.5.1, this may be because there are no credentials on this request. details: (pinging docker registry returned: Get https://registry-1.docker.io/v2/: net/http: TLS handshake timeout)
9m 1m 2 {kubelet viaq.logging.test} Warning FailedSync Error syncing pod, skipping: failed to "StartContainer" for "curator" with ErrImagePull: "image pull failed for docker.io/openshift/origin-logging-curator:v1.5.1, this may be because there are no credentials on this request. details: (pinging docker registry returned: Get https://registry-1.docker.io/v2/: net/http: TLS handshake timeout)"
9m 23s 5 {kubelet viaq.logging.test} Warning FailedSync Error syncing pod, skipping: failed to "StartContainer" for "curator" with ImagePullBackOff: "Back-off pulling image \"docker.io/openshift/origin-logging-curator:v1.5.1\""
9m 23s 5 {kubelet viaq.logging.test} spec.containers{curator} Normal BackOff Back-off pulling image "docker.io/openshift/origin-logging-curator:v1.5.1"
10m 11s 5 {kubelet viaq.logging.test} spec.containers{curator} Normal Pulling pulling image "docker.io/openshift/origin-logging-curator:v1.5.1"
[root@viaq openshift-ansible]#
[root@viaq openshift-ansible]# oc get pods
NAME READY STATUS RESTARTS AGE
logging-curator-1-hklth 1/1 Running 1 11m
logging-es-7vo926zw-1-deploy 0/1 Error 0 15m
logging-fluentd-024kj 1/1 Running 0 15m
logging-kibana-1-deploy 0/1 Error 0 15m
logging-kibana-1-xk06c 0/2 Terminating 0 11m
[root@viaq openshift-ansible]#
[root@viaq openshift-ansible]# oc describe pod logging-es-7vo926zw-1-deploy
Name: logging-es-7vo926zw-1-deploy
Namespace: logging
Security Policy: restricted
Node: viaq.logging.test/172.16.93.5
Start Time: Thu, 22 Jun 2017 13:05:05 +0000
Labels: openshift.io/deployer-pod-for.name=logging-es-7vo926zw-1
Status: Failed
IP: 10.128.0.6
Controllers: <none>
Containers:
deployment:
Container ID: docker://cba0ef48b0499c45668fbbea97784437b6804adf59f2d91b56c56d0b75cd583c
Image: openshift/origin-deployer:v1.5.1
Image ID: docker-pullable://docker.io/openshift/origin-deployer@sha256:77ac551235d8edf43ccb2fbd8fa5384ad9d8b94ba726f778fced18710c5f74f0
Port:
State: Terminated
Reason: Error
Exit Code: 1
Started: Thu, 22 Jun 2017 13:08:55 +0000
Finished: Thu, 22 Jun 2017 13:18:56 +0000
Ready: False
Restart Count: 0
Volume Mounts:
/var/run/secrets/kubernetes.io/serviceaccount from deployer-token-qmqgn (ro)
Environment Variables:
KUBERNETES_MASTER: https://viaq.logging.test:8443
OPENSHIFT_MASTER: https://viaq.logging.test:8443
BEARER_TOKEN_FILE: /var/run/secrets/kubernetes.io/serviceaccount/token
OPENSHIFT_CA_DATA: -----BEGIN CERTIFICATE-----
MIIC6jCCAdKgAwIBAgIBATANBgkqhkiG9w0BAQsFADAmMSQwIgYDVQQDDBtvcGVu
c2hpZnQtc2lnbmVyQDE0OTgxMzM5NjUwHhcNMTcwNjIyMTIxOTI0WhcNMjIwNjIx
MTIxOTI1WjAmMSQwIgYDVQQDDBtvcGVuc2hpZnQtc2lnbmVyQDE0OTgxMzM5NjUw
ggEiMA0GCSqGSIb3DQEBAQUAA4IBDwAwggEKAoIBAQDJmPL57vOyay5sadvxhCa6
2xJ1Eks0UhdEp1wVKblcbtogXHGFzDC8eTdYNpRMtrkcJiqZyhciTZt5fihM2GEO
Mf6n9ebLN7/Qf7pwETiFNUVffUCzkf6dicIXTvLVnMKFgX8dhNVxbO3CH/2kb14r
8FssYws+165gRrhVxRK+ovdTl9NVfDXHqRxaehxrgFhxG0gPrEDAu86CcEgk557w
M3F07+GI+r4oAiYgv/uoURjbQOJpx/fzHXavNFmbV4pKZZW+3DVfHjhk3+l2+jrt
wz40A3znNmRs4Dv1IMU5EP0QuiiOZZETrOMonMLWjGxD2T9sJ04ox5zUS1WQNxXB
AgMBAAGjIzAhMA4GA1UdDwEB/wQEAwICpDAPBgNVHRMBAf8EBTADAQH/MA0GCSqG
SIb3DQEBCwUAA4IBAQBtVSCPOyK/+CUVSe/JGwe+fv1se3IDkqYBMbbURO6TUJsx
UWJkioEMiLf8DZOdnYfvJb6XPXNlyMTwLFfJWh1GHBHFjVvLaxIcHoqpxPq/Tw6q
IG+K/bJvszdd0Ph47lOP6ks92gzOY4x5NgW91mjtRfnmGEDNQgbsA3uAwICWpewD
MuOVfxNT6aYD98h3LqyQa97C4NSqhGekDC5FV7mJBHp2uzKpE1g1Cvp2aFzShxKo
sdJDSKZ5ieKF8qvvaxDPpNKmx2hyb9911rIdN2Antn/2x7622tqZB2U64uGZV5Rr
aSN4X5gNDlVoo70e++qRLMKR+VlO3QTTk0jVPFRT
-----END CERTIFICATE-----
OPENSHIFT_DEPLOYMENT_NAME: logging-es-7vo926zw-1
OPENSHIFT_DEPLOYMENT_NAMESPACE: logging
Conditions:
Type Status
Initialized True
Ready False
PodScheduled True
Volumes:
deployer-token-qmqgn:
Type: Secret (a volume populated by a Secret)
SecretName: deployer-token-qmqgn
QoS Class: BestEffort
Tolerations: <none>
Events:
FirstSeen LastSeen Count From SubObjectPath Type Reason Message
--------- -------- ----- ---- ------------- -------- ------ -------
16m 16m 1 {default-scheduler } Normal Scheduled Successfully assigned logging-es-7vo926zw-1-deploy to viaq.logging.test
15m 15m 1 {kubelet viaq.logging.test} spec.containers{deployment} Warning Failed Failed to pull image "openshift/origin-deployer:v1.5.1": net/http: request canceled
15m 15m 1 {kubelet viaq.logging.test} Warning FailedSync Error syncing pod, skipping: failed to "StartContainer" for "deployment" with ErrImagePull: "net/http: request canceled"
14m 14m 1 {kubelet viaq.logging.test} Warning FailedSync Error syncing pod, skipping: failed to "StartContainer" for "deployment" with ErrImagePull: "Get https://registry-1.docker.io/v1/repositories/openshift/origin-deployer/tags/v1.5.1: net/http: TLS handshake timeout"
14m 14m 1 {kubelet viaq.logging.test} spec.containers{deployment} Warning Failed Failed to pull image "openshift/origin-deployer:v1.5.1": Get https://registry-1.docker.io/v1/repositories/openshift/origin-deployer/tags/v1.5.1: net/http: TLS handshake timeout
15m 14m 2 {kubelet viaq.logging.test} spec.containers{deployment} Normal BackOff Back-off pulling image "openshift/origin-deployer:v1.5.1"
15m 14m 2 {kubelet viaq.logging.test} Warning FailedSync Error syncing pod, skipping: failed to "StartContainer" for "deployment" with ImagePullBackOff: "Back-off pulling image \"openshift/origin-deployer:v1.5.1\""
16m 14m 3 {kubelet viaq.logging.test} spec.containers{deployment} Normal Pulling pulling image "openshift/origin-deployer:v1.5.1"
12m 12m 1 {kubelet viaq.logging.test} spec.containers{deployment} Normal Pulled Successfully pulled image "openshift/origin-deployer:v1.5.1"
12m 12m 1 {kubelet viaq.logging.test} spec.containers{deployment} Normal Created Created container with docker id cba0ef48b049; Security:[seccomp=unconfined]
12m 12m 1 {kubelet viaq.logging.test} spec.containers{deployment} Normal Started Started container with docker id cba0ef48b049
[root@viaq openshift-ansible]#
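The event streams in the transcripts above repeat the same failing image references many times (network timeouts to Docker Hub for the auth-proxy and deployer images, plus a "Tag v1.5.1 not found" error for the kibana image). To get a de-duplicated list of images worth pre-pulling or verifying on the node, something like this sketch over the `oc describe` output can help:

```shell
# Collect the distinct image references that failed to pull, across all
# pod events in the logging namespace, de-duplicated.
oc describe pods -n logging \
  | grep -o 'Failed to pull image "[^"]*"' \
  | sort -u
```

Each surviving line names one image to retry by hand (e.g. `docker pull <image>` on the node) to tell transient registry timeouts apart from a genuinely missing tag.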