1 year ago
#387365
user3458271
Hazelcast Jet not allowing Tomcat to stop
I am using Hazelcast Jet to do some aggregation and grouping, but after being idle for some time, when I try to stop my Tomcat it will not stop and I have to restart my PC. Below is the error I am getting. Can anyone guide me on what this error actually means and how to shut Jet down gracefully?
Sending multicast datagram failed. Exception message saying the operation is not permitted usually means the underlying OS is not able to send packets at a given pace. It can be caused by starting several hazelcast members in parallel when the members send their join message nearly at the same time.
java.net.NoRouteToHostException: No route to host: Datagram send failed
at java.net.TwoStacksPlainDatagramSocketImpl.send(Native Method)
at java.net.DatagramSocket.send(DatagramSocket.java:693)
at com.hazelcast.internal.cluster.impl.MulticastService.send(MulticastService.java:291)
at com.hazelcast.internal.cluster.impl.MulticastJoiner.searchForOtherClusters(MulticastJoiner.java:113)
at com.hazelcast.internal.cluster.impl.SplitBrainHandler.searchForOtherClusters(SplitBrainHandler.java:75)
at com.hazelcast.internal.cluster.impl.SplitBrainHandler.run(SplitBrainHandler.java:42)
at com.hazelcast.spi.impl.executionservice.impl.DelegateAndSkipOnConcurrentExecutionDecorator$DelegateDecorator.run(DelegateAndSkipOnConcurrentExecutionDecorator.java:77)
at com.hazelcast.internal.util.executor.CachedExecutorServiceDelegate$Worker.run(CachedExecutorServiceDelegate.java:217)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
at com.hazelcast.internal.util.executor.HazelcastManagedThread.executeRun(HazelcastManagedThread.java:76)
at com.hazelcast.internal.util.executor.HazelcastManagedThread.run(HazelcastManagedThread.java:102)
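From the stack trace it looks like Hazelcast's SplitBrainHandler keeps trying to send multicast join datagrams in the background. In my config below I do not configure the join mechanism at all, so I am guessing that disabling multicast discovery and listing members over TCP/IP might stop these packets. This is just a rough sketch of what I mean (the JoinConfigSketch class and withoutMulticast method are my own illustrative names, not something from my real code), and I am not sure it is the right fix:

import com.hazelcast.config.JoinConfig;
import com.hazelcast.jet.config.JetConfig;

class JoinConfigSketch {

    // Sketch: disable multicast discovery and use a fixed TCP/IP member list,
    // so the joiner no longer has to send multicast datagrams.
    static JetConfig withoutMulticast(JetConfig jetConfig) {
        jetConfig.configureHazelcast(c -> {
            JoinConfig join = c.getNetworkConfig().getJoin();
            join.getMulticastConfig().setEnabled(false);   // stop multicast join packets
            join.getTcpIpConfig().setEnabled(true)
                .addMember("127.0.0.1");                   // this single local member only
        });
        return jetConfig;
    }
}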
The code is quite large, but I have tried to show a sample below; it may not work as-is since it is just a glimpse of the code:
import com.hazelcast.collection.IList;
import com.hazelcast.jet.Jet;
import com.hazelcast.jet.JetInstance;
import com.hazelcast.jet.Job;
import com.hazelcast.jet.config.JetConfig;
import com.hazelcast.jet.config.JobConfig;
import com.hazelcast.jet.pipeline.BatchSource;
import com.hazelcast.jet.pipeline.BatchStage;
import com.hazelcast.jet.pipeline.Pipeline;
import com.hazelcast.jet.pipeline.Sinks;
import com.hazelcast.jet.pipeline.Sources;
import com.hazelcast.map.IMap;

import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.UUID;

class Abc {

    private final JetInstance jetInstance;
    // Target map name used below; in my real code this field is initialised elsewhere
    private final String uid = UUID.randomUUID().toString();

    Abc() {
        // Create the Jet instance
        JetConfig jetConfig = new JetConfig();
        jetConfig.getHazelcastConfig().setProperty("hazelcast.logging.type", "log4j");
        jetConfig.getInstanceConfig().setCooperativeThreadCount(5);
        jetConfig.configureHazelcast(c -> {
            c.getNetworkConfig().setReuseAddress(true);
            c.setClusterName("DATA" + UUID.randomUUID().toString());
            c.getNetworkConfig().setPort(9093);
            c.getNetworkConfig().setPublicAddress("localhost");
            c.getNetworkConfig().setPortAutoIncrement(true);
        });
        jetInstance = Jet.newJetInstance(jetConfig);
    }

    public Pipeline createPipeline() {
        return Pipeline.create();
    }

    // Submit the pipeline as a job and wait for it to finish
    public void joinPipeToJet(Pipeline pl, String name) {
        JobConfig j = new JobConfig();
        //j.setProcessingGuarantee(ProcessingGuarantee.EXACTLY_ONCE);
        j.setName(name);
        jetInstance.newJob(pl, j).join();
    }

    public void readJsonFile(final Map<String, Object> data) {
        // Random id for the job so I can separate the IMaps of two jobs
        String jobid = UUID.randomUUID().toString();
        try {
            Pipeline pl = createPipeline();
            UUID idOne = UUID.randomUUID();
            final IMap<Object, Object> abc = jetInstance.getMap(idOne.toString());
            abc.putAll(data);
            // Read the data from this map and write it to the next map
            final BatchSource<Map.Entry<Object, Object>> batchSource = Sources.map(abc);
            pl.readFrom(batchSource)
              .writeTo(Sinks.map(this.uid));
            joinPipeToJet(pl, jobid);
            abc.destroy();
        } catch (Exception e) {
            Job j1 = jetInstance.getJob(jobid);
            if (j1 != null) {
                j1.cancel();
            }
        } finally {
            Job j1 = jetInstance.getJob(jobid);
            if (j1 != null) {
                j1.cancel();
            }
        }
    }

    // Process to manipulate the data and return it, via a BatchStage, as a Map
    public Map<String, Object> runProcess(final Pipeline pl) {
        String jobid = UUID.randomUUID().toString();
        UUID idOne = UUID.randomUUID();
        BatchStage<Object> bd1 = null; // get data by calling a method (omitted here)
        bd1.writeTo(Sinks.list(idOne.toString()));
        joinPipeToJet(pl, jobid);
        IList<Object> abc = jetInstance.getList(idOne.toString());
        List<Object> result = new ArrayList<>(abc);
        final Map<String, Object> finalresult = new HashMap<>();
        finalresult.put("datas", result.get(0));
        abc.destroy();
        return finalresult;
    }

    public static void main(String... args) {
        Abc abc = new Abc();
        Map<String, Object> p = new HashMap<>();
        // p.putAll(...); // fill p with some data (omitted here)
        abc.readJsonFile(p);
        Pipeline pl = abc.createPipeline();
        abc.runProcess(pl);
    }
}
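Right now nothing shuts the Jet instance down when the web application stops, so I assume its non-daemon threads are what keeps Tomcat alive. What I am considering is something like the listener below (JetShutdownListener is just an illustrative name of mine); I am not sure whether calling Jet.shutdownAll() on context shutdown is the correct, graceful way to do this:

import javax.servlet.ServletContextEvent;
import javax.servlet.ServletContextListener;
import javax.servlet.annotation.WebListener;

import com.hazelcast.jet.Jet;

@WebListener
public class JetShutdownListener implements ServletContextListener {

    @Override
    public void contextInitialized(ServletContextEvent sce) {
        // Nothing to do on startup; the Jet instance is created elsewhere.
    }

    @Override
    public void contextDestroyed(ServletContextEvent sce) {
        // Shut down every Jet instance created in this JVM so their
        // non-daemon threads no longer keep Tomcat's JVM alive.
        Jet.shutdownAll();
    }
}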
java
hazelcast
hazelcast-jet
0 Answers