exec java -javaagent:/opt/agent-bond/agent-bond.jar=jolokia{{host=0.0.0.0}},jmx_exporter{{9779:/opt/agent-bond/jmx_exporter_config.yml}} -Xmx768m -XX:ParallelGCThreads=1 -XX:ConcGCThreads=1 -Djava.util.concurrent.ForkJoinPool.common.parallelism=1 -XX:CICompilerCount=2 -XX:+UseParallelGC -XX:GCTimeRatio=4 -XX:AdaptiveSizePolicyWeight=90 -XX:MinHeapFreeRatio=20 -XX:MaxHeapFreeRatio=40 -XX:+ExitOnOutOfMemoryError -cp .:/maven/* io.thorntail.Thorntail
I> No access restrictor found, access to any MBean is allowed
Jolokia: Agent started with URL http://10.1.2.138:8778/jolokia/
2018-05-14 05:11:19 INFO core:170 - THORN-000001: Thorntail - version 1.0.0-SNAPSHOT
2018-05-14 05:11:19 INFO core:172 - THORN-000010: Thorntail starting
2018-05-14 05:11:19 INFO Version:145 - WELD-000900: 3.0.3 (Final)
2018-05-14 05:11:19 INFO Bootstrap:49 - WELD-ENV-000020: Using jandex for bean discovery
2018-05-14 05:11:21 INFO Bootstrap:219 - WELD-000101: Transactional services not available. Injection of @Inject UserTransaction not available. Transactional observers will be invoked synchronously.
2018-05-14 05:11:22 INFO Bootstrap:46 - WELD-000119: Not generating any bean definitions from io.thorntail.servlet.impl.undertow.metrics.MetricsIntegration because of underlying class loading error: Type org.eclipse.microprofile.metrics.MetricRegistry not found. If this is unexpected, enable DEBUG logging to see the full error.
2018-05-14 05:11:22 INFO Bootstrap:46 - WELD-000119: Not generating any bean definitions from io.thorntail.tracing.jaeger.impl.ReporterConfigurationProducer because of underlying class loading error: Type com.uber.jaeger.senders.Sender not found. If this is unexpected, enable DEBUG logging to see the full error.
2018-05-14 05:11:22 INFO Bootstrap:46 - WELD-000119: Not generating any bean definitions from io.thorntail.tracing.jaeger.impl.JaegerTracerProvider because of underlying class loading error: Type com.uber.jaeger.Configuration not found. If this is unexpected, enable DEBUG logging to see the full error.
2018-05-14 05:11:22 INFO Bootstrap:46 - WELD-000119: Not generating any bean definitions from io.thorntail.tracing.jaeger.impl.ConfigurationProducer because of underlying class loading error: Type com.uber.jaeger.Configuration$SamplerConfiguration not found. If this is unexpected, enable DEBUG logging to see the full error.
2018-05-14 05:11:22 INFO Bootstrap:46 - WELD-000119: Not generating any bean definitions from io.thorntail.tracing.jaeger.impl.SamplerConfigurationProducer because of underlying class loading error: Type com.uber.jaeger.Configuration$SamplerConfiguration not found. If this is unexpected, enable DEBUG logging to see the full error.
2018-05-14 05:11:22 INFO Bootstrap:46 - WELD-000119: Not generating any bean definitions from io.thorntail.tracing.jaeger.impl.HttpSenderProducer because of underlying class loading error: Type com.uber.jaeger.senders.HttpSender not found. If this is unexpected, enable DEBUG logging to see the full error.
2018-05-14 05:11:22 INFO Bootstrap:46 - WELD-000119: Not generating any bean definitions from io.thorntail.tracing.jaeger.impl.UdpSenderProducer because of underlying class loading error: Type com.uber.jaeger.senders.UdpSender not found. If this is unexpected, enable DEBUG logging to see the full error.
2018-05-14 05:11:22 INFO Bootstrap:46 - WELD-000119: Not generating any bean definitions from io.thorntail.tracing.jaeger.impl.SenderProducer because of underlying class loading error: Type com.uber.jaeger.senders.UdpSender not found. If this is unexpected, enable DEBUG logging to see the full error.
2018-05-14 05:11:24 INFO Bootstrap:235 - WELD-ENV-002003: Weld SE container e0bb269f-c9bd-43ae-a6ad-c65790eefd8c initialized
2018-05-14 05:11:24 INFO core:229 - THORN-000013: phase [CDI initialize] completed in 4.911s
2018-05-14 05:11:24 WARN core:39 - THORN-000030: No valid OpenTracing Tracer resolved
2018-05-14 05:11:24 INFO core:43 - THORN-000031: Registered OpenTracing Tracer 'io.opentracing.NoopTracerImpl'
2018-05-14 05:11:24 INFO core:229 - THORN-000013: phase [bootstrap] completed in 0.390s
2018-05-14 05:11:24 INFO jaxrs:37 - THORN-002001: Deployment created for jaxrs-BirdAnalysis
2018-05-14 05:11:24 INFO core:229 - THORN-000013: phase [scan] completed in 0.106s
2018-05-14 05:11:24 INFO core:229 - THORN-000013: phase [initialize] completed in 0.9s
2018-05-14 05:11:25 INFO servletWeldServlet:115 - WELD-ENV-001007: Initialize Weld using ServletContextListener
2018-05-14 05:11:25 INFO servletUndertow:39 - WELD-ENV-001302: Undertow detected, CDI injection will be available in Servlets, Filters and Listeners.
INFO : Mon May 14 05:11:26 UTC 2018 [io.opentracing.contrib.jaxrs2.server.ServerTracingDynamicFeature] Registering tracing on deployed resources...
2018-05-14 05:11:26 INFO core:229 - THORN-000013: phase [deploy] completed in 1.592s
2018-05-14 05:11:26 INFO core:229 - THORN-000013: phase [before start] completed in 0.0s
2018-05-14 05:11:26 INFO xnio:104 - XNIO version 3.3.8.Final
2018-05-14 05:11:26 INFO nio:55 - XNIO NIO Implementation Version 3.3.8.Final
2018-05-14 05:11:26 INFO servlet:191 - THORN-001001: unified : server started at: http://[::]:8080/
2018-05-14 05:11:26 INFO core:229 - THORN-000013: phase [start] completed in 0.396s
2018-05-14 05:11:26 INFO servlet:84 - THORN-001002: unified : deployment jaxrs-BirdAnalysis: /
2018-05-14 05:11:26 INFO core:229 - THORN-000013: phase [after start] completed in 0.0s
2018-05-14 05:11:26 INFO core:221 - THORN-000999: Thorntail started in 7.404s
2018-05-14 05:11:36 INFO ParkSummary:47 - After properties - You Bet
2018-05-14 05:11:36 INFO ParkSummary:48 - Key: schema Value: public
2018-05-14 05:11:36 INFO ParkSummary:48 - Key: database Value: birds
2018-05-14 05:11:36 INFO ParkSummary:48 - Key: port Value: 5432
2018-05-14 05:11:36 INFO ParkSummary:48 - Key: passwd Value: R7xnmB8tkAaFEA0mm8fg
2018-05-14 05:11:36 INFO ParkSummary:48 - Key: dbtype Value: postgis
2018-05-14 05:11:36 INFO ParkSummary:48 - Key: host Value: 172.30.136.235
2018-05-14 05:11:36 INFO ParkSummary:48 - Key: user Value: userxaok5
SEVERE: Mon May 14 05:11:38 UTC 2018 [org.geotools.jdbc] There's code using JDBC based datastore and not disposing them. This may lead to temporary loss of database connections. Please make sure all data access code calls DataStore.dispose() before freeing all references to it
2018-05-14 05:11:38 INFO SparkContext:54 - Running Spark version 2.3.0
2018-05-14 05:11:38 INFO SparkContext:54 - Submitted application: Analyzing Parks In The Birds
2018-05-14 05:11:38 INFO SecurityManager:54 - Changing view acls to: ?,bob
2018-05-14 05:11:38 INFO SecurityManager:54 - Changing modify acls to: ?,bob
2018-05-14 05:11:38 INFO SecurityManager:54 - Changing view acls groups to:
2018-05-14 05:11:38 INFO SecurityManager:54 - Changing modify acls groups to:
2018-05-14 05:11:38 INFO SecurityManager:54 - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(?, bob); groups with view permissions: Set(); users with modify permissions: Set(?, bob); groups with modify permissions: Set()
2018-05-14 05:11:39 INFO Utils:54 - Successfully started service 'sparkDriver' on port 36631.
2018-05-14 05:11:40 INFO SparkEnv:54 - Registering MapOutputTracker
2018-05-14 05:11:40 INFO SparkEnv:54 - Registering BlockManagerMaster
2018-05-14 05:11:40 INFO BlockManagerMasterEndpoint:54 - Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
2018-05-14 05:11:40 INFO BlockManagerMasterEndpoint:54 - BlockManagerMasterEndpoint up
2018-05-14 05:11:40 INFO DiskBlockManager:54 - Created local directory at /tmp/blockmgr-1e19caf0-e197-4a60-9f4c-dbc2c09209ba
2018-05-14 05:11:40 INFO MemoryStore:54 - MemoryStore started with capacity 229.8 MB
2018-05-14 05:11:40 INFO SparkEnv:54 - Registering OutputCommitCoordinator
2018-05-14 05:11:40 INFO log:192 - Logging initialized @24427ms
2018-05-14 05:11:40 INFO Server:346 - jetty-9.3.z-SNAPSHOT
2018-05-14 05:11:40 INFO Server:414 - Started @24649ms
2018-05-14 05:11:41 INFO AbstractConnector:278 - Started ServerConnector@cfe8411{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2018-05-14 05:11:41 INFO Utils:54 - Successfully started service 'SparkUI' on port 4040.
2018-05-14 05:11:41 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@2134767{/jobs,null,AVAILABLE,@Spark}
2018-05-14 05:11:41 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@4487a761{/jobs/json,null,AVAILABLE,@Spark}
2018-05-14 05:11:41 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@2af2f5cd{/jobs/job,null,AVAILABLE,@Spark}
2018-05-14 05:11:41 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@1e1f4dc1{/jobs/job/json,null,AVAILABLE,@Spark}
2018-05-14 05:11:41 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@675a5dcc{/stages,null,AVAILABLE,@Spark}
2018-05-14 05:11:41 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@5abb8077{/stages/json,null,AVAILABLE,@Spark}
2018-05-14 05:11:41 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@6e3409c5{/stages/stage,null,AVAILABLE,@Spark}
2018-05-14 05:11:41 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@391e3f89{/stages/stage/json,null,AVAILABLE,@Spark}
2018-05-14 05:11:41 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@77aa634e{/stages/pool,null,AVAILABLE,@Spark}
2018-05-14 05:11:41 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@740c5034{/stages/pool/json,null,AVAILABLE,@Spark}
2018-05-14 05:11:41 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@4ed7872f{/storage,null,AVAILABLE,@Spark}
2018-05-14 05:11:41 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@6fd0b0b{/storage/json,null,AVAILABLE,@Spark}
2018-05-14 05:11:41 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@22e9a889{/storage/rdd,null,AVAILABLE,@Spark}
2018-05-14 05:11:41 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@3472c110{/storage/rdd/json,null,AVAILABLE,@Spark}
2018-05-14 05:11:41 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@46eec64e{/environment,null,AVAILABLE,@Spark}
2018-05-14 05:11:41 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@28a528cb{/environment/json,null,AVAILABLE,@Spark}
2018-05-14 05:11:41 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@47e760b6{/executors,null,AVAILABLE,@Spark}
2018-05-14 05:11:41 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@65bf431b{/executors/json,null,AVAILABLE,@Spark}
2018-05-14 05:11:41 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@55d33554{/executors/threadDump,null,AVAILABLE,@Spark}
2018-05-14 05:11:41 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@5431ac55{/executors/threadDump/json,null,AVAILABLE,@Spark}
2018-05-14 05:11:41 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@282f94b5{/static,null,AVAILABLE,@Spark}
2018-05-14 05:11:41 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@68116dda{/,null,AVAILABLE,@Spark}
2018-05-14 05:11:41 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@d58d870{/api,null,AVAILABLE,@Spark}
2018-05-14 05:11:41 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@31f56ffa{/jobs/job/kill,null,AVAILABLE,@Spark}
2018-05-14 05:11:41 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@23b11907{/stages/stage/kill,null,AVAILABLE,@Spark}
2018-05-14 05:11:41 INFO SparkUI:54 - Bound SparkUI to 0.0.0.0, and started at http://birdanalysis-30-j9svs:4040
2018-05-14 05:11:41 WARN NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2018-05-14 05:11:42 INFO StandaloneAppClient$ClientEndpoint:54 - Connecting to master spark://sparky:7077...
2018-05-14 05:11:43 INFO TransportClientFactory:267 - Successfully created connection to sparky/172.30.15.2:7077 after 104 ms (0 ms spent in bootstraps)
2018-05-14 05:11:43 INFO StandaloneSchedulerBackend:54 - Connected to Spark cluster with app ID app-20180514051143-0013
2018-05-14 05:11:43 INFO StandaloneAppClient$ClientEndpoint:54 - Executor added: app-20180514051143-0013/0 on worker-20180513010601-10.1.4.106-42664 (10.1.4.106:42664) with 16 core(s)
2018-05-14 05:11:43 INFO StandaloneSchedulerBackend:54 - Granted executor ID app-20180514051143-0013/0 on hostPort 10.1.4.106:42664 with 16 core(s), 1024.0 MB RAM
2018-05-14 05:11:43 INFO Utils:54 - Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 40586.
2018-05-14 05:11:43 INFO NettyBlockTransferService:54 - Server created on birdanalysis-30-j9svs:40586
2018-05-14 05:11:43 INFO StandaloneAppClient$ClientEndpoint:54 - Executor updated: app-20180514051143-0013/0 is now RUNNING
2018-05-14 05:11:43 INFO BlockManager:54 - Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
2018-05-14 05:11:43 INFO BlockManagerMaster:54 - Registering BlockManager BlockManagerId(driver, birdanalysis-30-j9svs, 40586, None)
2018-05-14 05:11:43 INFO BlockManagerMasterEndpoint:54 - Registering block manager birdanalysis-30-j9svs:40586 with 229.8 MB RAM, BlockManagerId(driver, birdanalysis-30-j9svs, 40586, None)
2018-05-14 05:11:43 INFO BlockManagerMaster:54 - Registered BlockManager BlockManagerId(driver, birdanalysis-30-j9svs, 40586, None)
2018-05-14 05:11:43 INFO BlockManager:54 - Initialized BlockManager: BlockManagerId(driver, birdanalysis-30-j9svs, 40586, None)
2018-05-14 05:11:44 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@1aa5334c{/metrics/json,null,AVAILABLE,@Spark}
2018-05-14 05:11:44 INFO StandaloneSchedulerBackend:54 - SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0
2018-05-14 05:11:44 INFO SharedState:54 - Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('file:/maven/spark-warehouse').
2018-05-14 05:11:44 INFO SharedState:54 - Warehouse path is 'file:/maven/spark-warehouse'.
2018-05-14 05:11:44 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@43c6c336{/SQL,null,AVAILABLE,@Spark}
2018-05-14 05:11:44 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@1a962194{/SQL/json,null,AVAILABLE,@Spark}
2018-05-14 05:11:44 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@2bd01af3{/SQL/execution,null,AVAILABLE,@Spark}
2018-05-14 05:11:44 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@416a04a3{/SQL/execution/json,null,AVAILABLE,@Spark}
2018-05-14 05:11:44 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@38174356{/static/sql,null,AVAILABLE,@Spark}
2018-05-14 05:11:46 INFO StateStoreCoordinatorRef:54 - Registered StateStoreCoordinator endpoint
2018-05-14 05:11:48 INFO StandaloneAppClient$ClientEndpoint:54 - Executor updated: app-20180514051143-0013/0 is now EXITED (Command exited with code 1)
2018-05-14 05:11:48 INFO StandaloneSchedulerBackend:54 - Executor app-20180514051143-0013/0 removed: Command exited with code 1
2018-05-14 05:11:48 INFO StandaloneAppClient$ClientEndpoint:54 - Executor added: app-20180514051143-0013/1 on worker-20180513010601-10.1.4.106-42664 (10.1.4.106:42664) with 16 core(s)
2018-05-14 05:11:48 INFO StandaloneSchedulerBackend:54 - Granted executor ID app-20180514051143-0013/1 on hostPort 10.1.4.106:42664 with 16 core(s), 1024.0 MB RAM
2018-05-14 05:11:48 INFO StandaloneAppClient$ClientEndpoint:54 - Executor updated: app-20180514051143-0013/1 is now RUNNING
2018-05-14 05:11:48 INFO BlockManagerMaster:54 - Removal of executor 0 requested
2018-05-14 05:11:48 INFO CoarseGrainedSchedulerBackend$DriverEndpoint:54 - Asked to remove non-existent executor 0
2018-05-14 05:11:48 INFO BlockManagerMasterEndpoint:54 - Trying to remove executor 0 from BlockManagerMaster.
2018-05-14 05:11:53 INFO StandaloneAppClient$ClientEndpoint:54 - Executor updated: app-20180514051143-0013/1 is now EXITED (Command exited with code 1)
2018-05-14 05:11:53 INFO StandaloneSchedulerBackend:54 - Executor app-20180514051143-0013/1 removed: Command exited with code 1
2018-05-14 05:11:53 INFO StandaloneAppClient$ClientEndpoint:54 - Executor added: app-20180514051143-0013/2 on worker-20180513010601-10.1.4.106-42664 (10.1.4.106:42664) with 16 core(s)
2018-05-14 05:11:53 INFO BlockManagerMaster:54 - Removal of executor 1 requested
2018-05-14 05:11:53 INFO BlockManagerMasterEndpoint:54 - Trying to remove executor 1 from BlockManagerMaster.
2018-05-14 05:11:53 INFO CoarseGrainedSchedulerBackend$DriverEndpoint:54 - Asked to remove non-existent executor 1
2018-05-14 05:11:53 INFO StandaloneSchedulerBackend:54 - Granted executor ID app-20180514051143-0013/2 on hostPort 10.1.4.106:42664 with 16 core(s), 1024.0 MB RAM
2018-05-14 05:11:53 INFO StandaloneAppClient$ClientEndpoint:54 - Executor updated: app-20180514051143-0013/2 is now RUNNING
2018-05-14 05:11:56 WARN SharedState:66 - URL.setURLStreamHandlerFactory failed to set FsUrlStreamHandlerFactory
2018-05-14 05:11:56 ERROR request:80 - UT005023: Exception handling request to /parksummary
org.jboss.resteasy.spi.UnhandledException: org.apache.spark.SparkException: Unable to create database default as failed to create its directory file:/maven/spark-warehouse
at org.jboss.resteasy.core.ExceptionHandler.handleApplicationException(ExceptionHandler.java:78)
at org.jboss.resteasy.core.ExceptionHandler.handleException(ExceptionHandler.java:222)
at org.jboss.resteasy.core.SynchronousDispatcher.writeException(SynchronousDispatcher.java:175)
at org.jboss.resteasy.core.SynchronousDispatcher.invoke(SynchronousDispatcher.java:418)
at org.jboss.resteasy.core.SynchronousDispatcher.invoke(SynchronousDispatcher.java:209)
at org.jboss.resteasy.plugins.server.servlet.ServletContainerDispatcher.service(ServletContainerDispatcher.java:227)
at org.jboss.resteasy.plugins.server.servlet.HttpServletDispatcher.service(HttpServletDispatcher.java:56)
at org.jboss.resteasy.plugins.server.servlet.HttpServletDispatcher.service(HttpServletDispatcher.java:51)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
at io.undertow.servlet.handlers.ServletHandler.handleRequest(ServletHandler.java:85)
at io.undertow.servlet.handlers.FilterHandler$FilterChainImpl.doFilter(FilterHandler.java:129)
at io.thorntail.jaxrs.impl.opentracing.servlet.BetterSpanFinishingFilter.doFilter(BetterSpanFinishingFilter.java:55)
at io.undertow.servlet.core.ManagedFilter.doFilter(ManagedFilter.java:61)
at io.undertow.servlet.handlers.FilterHandler$FilterChainImpl.doFilter(FilterHandler.java:131)
at io.undertow.servlet.handlers.FilterHandler.handleRequest(FilterHandler.java:84)
at io.undertow.servlet.handlers.security.ServletSecurityRoleHandler.handleRequest(ServletSecurityRoleHandler.java:62)
at io.undertow.servlet.handlers.ServletChain$1.handleRequest(ServletChain.java:64)
at io.undertow.servlet.handlers.ServletDispatchingHandler.handleRequest(ServletDispatchingHandler.java:36)
at io.undertow.servlet.handlers.security.SSLInformationAssociationHandler.handleRequest(SSLInformationAssociationHandler.java:131)
at io.undertow.servlet.handlers.security.ServletAuthenticationCallHandler.handleRequest(ServletAuthenticationCallHandler.java:57)
at io.undertow.server.handlers.PredicateHandler.handleRequest(PredicateHandler.java:43)
at io.undertow.security.handlers.AbstractConfidentialityHandler.handleRequest(AbstractConfidentialityHandler.java:46)
at io.undertow.servlet.handlers.security.ServletConfidentialityConstraintHandler.handleRequest(ServletConfidentialityConstraintHandler.java:64)
at io.undertow.security.handlers.AuthenticationMechanismsHandler.handleRequest(AuthenticationMechanismsHandler.java:60)
at io.undertow.servlet.handlers.security.CachedAuthenticatedSessionHandler.handleRequest(CachedAuthenticatedSessionHandler.java:77)
at io.undertow.security.handlers.AbstractSecurityContextAssociationHandler.handleRequest(AbstractSecurityContextAssociationHandler.java:43)
at io.undertow.server.handlers.PredicateHandler.handleRequest(PredicateHandler.java:43)
at io.undertow.server.handlers.PredicateHandler.handleRequest(PredicateHandler.java:43)
at io.undertow.servlet.handlers.ServletInitialHandler.handleFirstRequest(ServletInitialHandler.java:292)
at io.undertow.servlet.handlers.ServletInitialHandler.access$100(ServletInitialHandler.java:81)
at io.undertow.servlet.handlers.ServletInitialHandler$2.call(ServletInitialHandler.java:138)
at io.undertow.servlet.handlers.ServletInitialHandler$2.call(ServletInitialHandler.java:135)
at io.undertow.servlet.core.ServletRequestContextThreadSetupAction$1.call(ServletRequestContextThreadSetupAction.java:48)
at io.undertow.servlet.core.ContextClassLoaderSetupAction$1.call(ContextClassLoaderSetupAction.java:43)
at io.undertow.servlet.core.ContextClassLoaderSetupAction$1.call(ContextClassLoaderSetupAction.java:43)
at io.undertow.servlet.handlers.ServletInitialHandler.dispatchRequest(ServletInitialHandler.java:272)
at io.undertow.servlet.handlers.ServletInitialHandler.handleRequest(ServletInitialHandler.java:197)
at io.undertow.server.handlers.HttpContinueReadHandler.handleRequest(HttpContinueReadHandler.java:65)
at io.undertow.server.handlers.PathHandler.handleRequest(PathHandler.java:94)
at io.undertow.server.handlers.resource.ResourceHandler$1.handleRequest(ResourceHandler.java:217)
at io.undertow.server.Connectors.executeRootHandler(Connectors.java:332)
at io.undertow.server.HttpServerExchange$1.run(HttpServerExchange.java:830)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.spark.SparkException: Unable to create database default as failed to create its directory file:/maven/spark-warehouse
at org.apache.spark.sql.catalyst.catalog.InMemoryCatalog.liftedTree1$1(InMemoryCatalog.scala:115)
at org.apache.spark.sql.catalyst.catalog.InMemoryCatalog.doCreateDatabase(InMemoryCatalog.scala:109)
at org.apache.spark.sql.catalyst.catalog.ExternalCatalog.createDatabase(ExternalCatalog.scala:69)
at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:117)
at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:102)
at org.apache.spark.sql.internal.BaseSessionStateBuilder.catalog$lzycompute(BaseSessionStateBuilder.scala:133)
at org.apache.spark.sql.internal.BaseSessionStateBuilder.catalog(BaseSessionStateBuilder.scala:131)
at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anon$1.<init>(BaseSessionStateBuilder.scala:157)
at org.apache.spark.sql.internal.BaseSessionStateBuilder.analyzer(BaseSessionStateBuilder.scala:157)
at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
at org.apache.spark.sql.internal.SessionState.analyzer$lzycompute(SessionState.scala:79)
at org.apache.spark.sql.internal.SessionState.analyzer(SessionState.scala:79)
at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:57)
at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:55)
at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:47)
at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:74)
at org.apache.spark.sql.SparkSession.baseRelationToDataFrame(SparkSession.scala:428)
at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:233)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:227)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:164)
at com.molw.ws.ParkSummary.getParksSummary(ParkSummary.java:74)
at com.molw.ws.ParkSummary$Proxy$_$$_WeldClientProxy.getParksSummary(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.jboss.resteasy.core.MethodInjectorImpl.invoke(MethodInjectorImpl.java:140)
at org.jboss.resteasy.core.ResourceMethodInvoker.invokeOnTarget(ResourceMethodInvoker.java:294)
at org.jboss.resteasy.core.ResourceMethodInvoker.invoke(ResourceMethodInvoker.java:248)
at org.jboss.resteasy.core.ResourceMethodInvoker.invoke(ResourceMethodInvoker.java:235)
at org.jboss.resteasy.core.SynchronousDispatcher.invoke(SynchronousDispatcher.java:402)
... 41 more
Caused by: java.io.IOException: failure to login
at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:822)
at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:774)
at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:647)
at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2755)
at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2747)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2613)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:370)
at org.apache.hadoop.fs.Path.getFileSystem(Path.java:296)
at org.apache.spark.sql.catalyst.catalog.InMemoryCatalog.liftedTree1$1(InMemoryCatalog.scala:111)
... 72 more
Caused by: javax.security.auth.login.LoginException: java.lang.NullPointerException: invalid null input: name
at com.sun.security.auth.UnixPrincipal.<init>(UnixPrincipal.java:71)
at com.sun.security.auth.module.UnixLoginModule.login(UnixLoginModule.java:133)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at javax.security.auth.login.LoginContext.invoke(LoginContext.java:755)
at javax.security.auth.login.LoginContext.access$000(LoginContext.java:195)
at javax.security.auth.login.LoginContext$4.run(LoginContext.java:682)
at javax.security.auth.login.LoginContext$4.run(LoginContext.java:680)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.login.LoginContext.invokePriv(LoginContext.java:680)
at javax.security.auth.login.LoginContext.login(LoginContext.java:587)
at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:797)
at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:774)
at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:647)
at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2755)
at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2747)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2613)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:370)
at org.apache.hadoop.fs.Path.getFileSystem(Path.java:296)
at org.apache.spark.sql.catalyst.catalog.InMemoryCatalog.liftedTree1$1(InMemoryCatalog.scala:111)
at org.apache.spark.sql.catalyst.catalog.InMemoryCatalog.doCreateDatabase(InMemoryCatalog.scala:109)
at org.apache.spark.sql.catalyst.catalog.ExternalCatalog.createDatabase(ExternalCatalog.scala:69)
at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:117)
at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:102)
at org.apache.spark.sql.internal.BaseSessionStateBuilder.catalog$lzycompute(BaseSessionStateBuilder.scala:133)
at org.apache.spark.sql.internal.BaseSessionStateBuilder.catalog(BaseSessionStateBuilder.scala:131)
at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anon$1.<init>(BaseSessionStateBuilder.scala:157)
at org.apache.spark.sql.internal.BaseSessionStateBuilder.analyzer(BaseSessionStateBuilder.scala:157)
at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
at org.apache.spark.sql.internal.SessionState.analyzer$lzycompute(SessionState.scala:79)
at org.apache.spark.sql.internal.SessionState.analyzer(SessionState.scala:79)
at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:57)
at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:55)
at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:47)
at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:74)
at org.apache.spark.sql.SparkSession.baseRelationToDataFrame(SparkSession.scala:428)
at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:233)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:227)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:164)
at com.molw.ws.ParkSummary.getParksSummary(ParkSummary.java:74)
at com.molw.ws.ParkSummary$Proxy$_$$_WeldClientProxy.getParksSummary(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.jboss.resteasy.core.MethodInjectorImpl.invoke(MethodInjectorImpl.java:140)
at org.jboss.resteasy.core.ResourceMethodInvoker.invokeOnTarget(ResourceMethodInvoker.java:294)
at org.jboss.resteasy.core.ResourceMethodInvoker.invoke(ResourceMethodInvoker.java:248)
at org.jboss.resteasy.core.ResourceMethodInvoker.invoke(ResourceMethodInvoker.java:235)
at org.jboss.resteasy.core.SynchronousDispatcher.invoke(SynchronousDispatcher.java:402)
at org.jboss.resteasy.core.SynchronousDispatcher.invoke(SynchronousDispatcher.java:209)
at org.jboss.resteasy.plugins.server.servlet.ServletContainerDispatcher.service(ServletContainerDispatcher.java:227)
at org.jboss.resteasy.plugins.server.servlet.HttpServletDispatcher.service(HttpServletDispatcher.java:56)
at org.jboss.resteasy.plugins.server.servlet.HttpServletDispatcher.service(HttpServletDispatcher.java:51)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
at io.undertow.servlet.handlers.ServletHandler.handleRequest(ServletHandler.java:85)
at io.undertow.servlet.handlers.FilterHandler$FilterChainImpl.doFilter(FilterHandler.java:129)
at io.thorntail.jaxrs.impl.opentracing.servlet.BetterSpanFinishingFilter.doFilter(BetterSpanFinishingFilter.java:55)
at io.undertow.servlet.core.ManagedFilter.doFilter(ManagedFilter.java:61)
at io.undertow.servlet.handlers.FilterHandler$FilterChainImpl.doFilter(FilterHandler.java:131)
at io.undertow.servlet.handlers.FilterHandler.handleRequest(FilterHandler.java:84)
at io.undertow.servlet.handlers.security.ServletSecurityRoleHandler.handleRequest(ServletSecurityRoleHandler.java:62)
at io.undertow.servlet.handlers.ServletChain$1.handleRequest(ServletChain.java:64)
at io.undertow.servlet.handlers.ServletDispatchingHandler.handleRequest(ServletDispatchingHandler.java:36)
at io.undertow.servlet.handlers.security.SSLInformationAssociationHandler.handleRequest(SSLInformationAssociationHandler.java:131)
at io.undertow.servlet.handlers.security.ServletAuthenticationCallHandler.handleRequest(ServletAuthenticationCallHandler.java:57)
at io.undertow.server.handlers.PredicateHandler.handleRequest(PredicateHandler.java:43)
at io.undertow.security.handlers.AbstractConfidentialityHandler.handleRequest(AbstractConfidentialityHandler.java:46)
at io.undertow.servlet.handlers.security.ServletConfidentialityConstraintHandler.handleRequest(ServletConfidentialityConstraintHandler.java:64)
at io.undertow.security.handlers.AuthenticationMechanismsHandler.handleRequest(AuthenticationMechanismsHandler.java:60)
at io.undertow.servlet.handlers.security.CachedAuthenticatedSessionHandler.handleRequest(CachedAuthenticatedSessionHandler.java:77)
at io.undertow.security.handlers.AbstractSecurityContextAssociationHandler.handleRequest(AbstractSecurityContextAssociationHandler.java:43)
at io.undertow.server.handlers.PredicateHandler.handleRequest(PredicateHandler.java:43)
at io.undertow.server.handlers.PredicateHandler.handleRequest(PredicateHandler.java:43)
at io.undertow.servlet.handlers.ServletInitialHandler.handleFirstRequest(ServletInitialHandler.java:292)
at io.undertow.servlet.handlers.ServletInitialHandler.access$100(ServletInitialHandler.java:81)
at io.undertow.servlet.handlers.ServletInitialHandler$2.call(ServletInitialHandler.java:138)
at io.undertow.servlet.handlers.ServletInitialHandler$2.call(ServletInitialHandler.java:135)
at io.undertow.servlet.core.ServletRequestContextThreadSetupAction$1.call(ServletRequestContextThreadSetupAction.java:48)
at io.undertow.servlet.core.ContextClassLoaderSetupAction$1.call(ContextClassLoaderSetupAction.java:43)
at io.undertow.servlet.core.ContextClassLoaderSetupAction$1.call(ContextClassLoaderSetupAction.java:43)
at io.undertow.servlet.handlers.ServletInitialHandler.dispatchRequest(ServletInitialHandler.java:272)
at io.undertow.servlet.handlers.ServletInitialHandler.handleRequest(ServletInitialHandler.java:197)
at io.undertow.server.handlers.HttpContinueReadHandler.handleRequest(HttpContinueReadHandler.java:65)
at io.undertow.server.handlers.PathHandler.handleRequest(PathHandler.java:94)
at io.undertow.server.handlers.resource.ResourceHandler$1.handleRequest(ResourceHandler.java:217)
at io.undertow.server.Connectors.executeRootHandler(Connectors.java:332)
at io.undertow.server.HttpServerExchange$1.run(HttpServerExchange.java:830)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
at javax.security.auth.login.LoginContext.invoke(LoginContext.java:856)
at javax.security.auth.login.LoginContext.access$000(LoginContext.java:195)
at javax.security.auth.login.LoginContext$4.run(LoginContext.java:682)
at javax.security.auth.login.LoginContext$4.run(LoginContext.java:680)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.login.LoginContext.invokePriv(LoginContext.java:680)
at javax.security.auth.login.LoginContext.login(LoginContext.java:587)
at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:797)
... 80 more