Tuesday, October 17, 2017

Enable debug logging for HDFS in bash

Recently I started facing an issue executing hdfs commands on my Linux server.

After a quick Google search I ended up with the environment variables below.



export HADOOP_ROOT_LOGGER=DEBUG,console

export HADOOP_DATANODE_OPTS="${HADOOP_DATANODE_OPTS} -Dhadoop.root.logger=DEBUG,DRFA"

export HADOOP_NAMENODE_OPTS="${HADOOP_NAMENODE_OPTS} -Dhadoop.root.logger=DEBUG,DRFA" 
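
The first variable is the client-side one: it sets the Hadoop root logger to DEBUG and points it at the console, which is exactly what the hdfs shell command picks up. The other two pass the same setting into the DataNode and NameNode daemon JVMs; DRFA is the daily rolling file appender defined in Hadoop's default log4j.properties, so the daemons keep writing their DEBUG output into their usual log files. If you want the daemon-side settings to survive a restart, a rough sketch would be to append them to hadoop-env.sh (the path below is the common one and is an assumption; it may differ on a CDH/parcel install):

# Sketch only: persist the daemon debug options (assumed path;
# Cloudera Manager-managed clusters keep this config elsewhere).
cat >> /etc/hadoop/conf/hadoop-env.sh <<'EOF'
export HADOOP_DATANODE_OPTS="${HADOOP_DATANODE_OPTS} -Dhadoop.root.logger=DEBUG,DRFA"
export HADOOP_NAMENODE_OPTS="${HADOOP_NAMENODE_OPTS} -Dhadoop.root.logger=DEBUG,DRFA"
EOF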


I was able to get the debugging I needed with the first environment variable alone.
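
If you only need the debug output for a single command, the variable can also be scoped to just that invocation instead of being exported for the whole session (plain bash, nothing Hadoop-specific assumed):

# Enable DEBUG for this one command only:
HADOOP_ROOT_LOGGER=DEBUG,console hdfs dfs -ls /

# If you exported it earlier, drop back to the default level with:
unset HADOOP_ROOT_LOGGER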


[hdfs@datanode1:[PTA] ~]$ hdfs dfs -ls /
17/10/17 15:24:08 DEBUG util.Shell: setsid exited with exit code 0
17/10/17 15:24:08 DEBUG conf.Configuration: parsing URL jar:file:/opt/cloudera/parcels/CDH-5.8.4-1.cdh5.8.4.p0.5/jars/hadoop-common-2.6.0-cdh5.8.4.jar!/core-default.xml
17/10/17 15:24:08 DEBUG conf.Configuration: parsing input stream sun.net.www.protocol.jar.JarURLConnection$JarURLInputStream@25c6ca49
17/10/17 15:24:08 DEBUG conf.Configuration: parsing URL file:/etc/hadoop/conf.cloudera.yarn/core-site.xml
17/10/17 15:24:08 DEBUG conf.Configuration: parsing input stream java.io.BufferedInputStream@1412504b
17/10/17 15:24:08 DEBUG core.Tracer: sampler.classes = ; loaded no samplers
17/10/17 15:24:08 DEBUG core.Tracer: span.receiver.classes = ; loaded no span receivers
17/10/17 15:24:08 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)], about=, type=DEFAULT, always=false, sampleName=Ops)
17/10/17 15:24:08 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[Rate of failed kerberos logins and latency (milliseconds)], about=, type=DEFAULT, always=false, sampleName=Ops)
17/10/17 15:24:08 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[GetGroups], about=, type=DEFAULT, always=false, sampleName=Ops)
17/10/17 15:24:08 DEBUG lib.MutableMetricsFactory: field private org.apache.hadoop.metrics2.lib.MutableGaugeLong org.apache.hadoop.security.UserGroupInformation$UgiMetrics.renewalFailuresTotal with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[Renewal failures since startup], about=, type=DEFAULT, always=false, sampleName=Ops)
17/10/17 15:24:08 DEBUG lib.MutableMetricsFactory: field private org.apache.hadoop.metrics2.lib.MutableGaugeInt org.apache.hadoop.security.UserGroupInformation$UgiMetrics.renewalFailures with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[Renewal failures since last successful login], about=, type=DEFAULT, always=false, sampleName=Ops)
17/10/17 15:24:08 DEBUG impl.MetricsSystemImpl: UgiMetrics, User and group related metrics
17/10/17 15:24:09 DEBUG security.SecurityUtil: Setting hadoop.security.token.service.use_ip to true
17/10/17 15:24:09 DEBUG security.Groups:  Creating new Groups object
17/10/17 15:24:09 DEBUG security.Groups: Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping; cacheTimeout=300000; warningDeltaMs=5000
17/10/17 15:24:09 DEBUG security.UserGroupInformation: hadoop login
17/10/17 15:24:09 DEBUG security.UserGroupInformation: hadoop login commit
17/10/17 15:24:09 DEBUG security.UserGroupInformation: using local user:UnixPrincipal: hdfs
17/10/17 15:24:09 DEBUG security.UserGroupInformation: Using user: "UnixPrincipal: hdfs" with name hdfs
17/10/17 15:24:09 DEBUG security.UserGroupInformation: User entry: "hdfs"
17/10/17 15:24:09 DEBUG security.UserGroupInformation: Assuming keytab is managed externally since logged in from subject.
17/10/17 15:24:09 DEBUG security.UserGroupInformation: UGI loginUser:hdfs (auth:SIMPLE)
17/10/17 15:24:09 DEBUG core.Tracer: sampler.classes = ; loaded no samplers
17/10/17 15:24:09 DEBUG core.Tracer: span.receiver.classes = ; loaded no span receivers
17/10/17 15:24:09 DEBUG hdfs.BlockReaderLocal: dfs.client.use.legacy.blockreader.local = false
17/10/17 15:24:09 DEBUG hdfs.BlockReaderLocal: dfs.client.read.shortcircuit = false
17/10/17 15:24:09 DEBUG hdfs.BlockReaderLocal: dfs.client.domain.socket.data.traffic = false
17/10/17 15:24:09 DEBUG hdfs.BlockReaderLocal: dfs.domain.socket.path = /var/run/hdfs-sockets/dn
17/10/17 15:24:09 DEBUG retry.RetryUtils: multipleLinearRandomRetry = null
17/10/17 15:24:09 DEBUG ipc.Server: rpcKind=RPC_PROTOCOL_BUFFER, rpcRequestWrapperClass=class org.apache.hadoop.ipc.ProtobufRpcEngine$RpcRequestWrapper, rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker@62e98ccf
17/10/17 15:24:09 DEBUG ipc.Client: getting client out of cache: org.apache.hadoop.ipc.Client@38d1b194
17/10/17 15:24:09 DEBUG util.NativeCodeLoader: Trying to load the custom-built native-hadoop library...
17/10/17 15:24:09 DEBUG util.NativeCodeLoader: Loaded the native-hadoop library
17/10/17 15:24:09 DEBUG unix.DomainSocketWatcher: org.apache.hadoop.net.unix.DomainSocketWatcher$2@350599cc: starting with interruptCheckPeriodMs = 60000
17/10/17 15:24:09 DEBUG util.PerformanceAdvisory: Both short-circuit local reads and UNIX domain socket are disabled.
17/10/17 15:24:09 DEBUG sasl.DataTransferSaslUtil: DataTransferProtocol not using SaslPropertiesResolver, no QOP found in configuration for dfs.data.transfer.protection
17/10/17 15:24:09 DEBUG ipc.Client: The ping interval is 60000 ms.
17/10/17 15:24:09 DEBUG ipc.Client: Connecting to namenode.tanu.com/169.35.91.237:8020
17/10/17 15:24:09 DEBUG ipc.Client: IPC Client (16613795) connection to namenode.tanu.com/169.35.91.237:8020 from hdfs: starting, having connections 1
17/10/17 15:24:09 DEBUG ipc.Client: IPC Client (16613795) connection to namenode.tanu.com/169.35.91.237:8020 from hdfs sending #0
17/10/17 15:24:09 DEBUG ipc.Client: IPC Client (16613795) connection to namenode.tanu.com/169.35.91.237:8020 from hdfs got value #0
17/10/17 15:24:09 DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 182ms
17/10/17 15:24:10 DEBUG ipc.Client: IPC Client (16613795) connection to namenode.tanu.com/169.35.91.237:8020 from hdfs sending #1
17/10/17 15:24:10 DEBUG ipc.Client: IPC Client (16613795) connection to namenode.tanu.com/169.35.91.237:8020 from hdfs got value #1
17/10/17 15:24:10 DEBUG ipc.ProtobufRpcEngine: Call: getListing took 2ms
Found 7 items
drwxr-x--x   - accumulo accumulo            0 2015-09-30 10:35 /accumulo
drwxr-xr-x   - hbase    hbase               0 2017-09-27 01:53 /hbase
drwxrwxr-x   - solr     solr                0 2017-10-05 06:55 /solr
drwxr-xr-x   - hdfs     supergroup          0 2017-05-02 09:31 /system
drwxrwxrwx   - hdfs     supergroup          0 2017-10-17 15:00 /tmp
drwxrwxrwx   - hdfs     supergroup          0 2017-10-12 12:13 /user
17/10/17 15:24:10 DEBUG ipc.Client: stopping client from cache: org.apache.hadoop.ipc.Client@38d1b194
17/10/17 15:24:10 DEBUG ipc.Client: removing client from cache: org.apache.hadoop.ipc.Client@38d1b194
17/10/17 15:24:10 DEBUG ipc.Client: stopping actual client because no more references remain: org.apache.hadoop.ipc.Client@38d1b194
17/10/17 15:24:10 DEBUG ipc.Client: Stopping client
17/10/17 15:24:10 DEBUG ipc.Client: IPC Client (16613795) connection to namenode.tanu.com/169.35.91.237:8020 from hdfs: closed
17/10/17 15:24:10 DEBUG ipc.Client: IPC Client (16613795) connection to namenode.tanu.com/169.35.91.237:8020 from hdfs: stopped, remaining connections 0
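
One side note on the run above: in Hadoop's stock log4j.properties the console appender writes to stderr while the actual listing goes to stdout, so the debug chatter can be split off if you only want to share the logs. A minimal sketch, assuming that default appender target:

# Listing stays on screen; DEBUG output lands in the file
# (assumes log4j.appender.console.target=System.err, the default):
HADOOP_ROOT_LOGGER=DEBUG,console hdfs dfs -ls / 2> hdfs-debug.log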
