I'm trying to deploy a very simple application on OpenShift. It's an EAR project with a single WAR and EJB module. Inside the WAR there's a REST service that calls an EJB defined in the EJB module. Both locally and on OpenShift I'm using WildFly 9.0.0 CR2 and PostgreSQL 9.2. When deploying locally everything works fine. When the same code is deployed on OpenShift, I get the following errors in the logs:
2015-06-28 18:23:04,574 WARN [org.jboss.as.clustering.jgroups] (MSC service thread 1-1) WFLYCLJG0006: property bind_addr for protocol org.jgroups.protocols.TCP attempting to override socket binding value 127.12.77.1 : property value 127.12.77.1 will be ignored
2015-06-28 18:23:04,574 WARN [org.jboss.as.clustering.jgroups] (MSC service thread 1-1) WFLYCLJG0006: property bind_port for protocol org.jgroups.protocols.TCP attempting to override socket binding value 7600 : property value 7600 will be ignored
2015-06-28 18:23:06,252 INFO [org.jboss.as.jpa] (ServerService Thread Pool -- 70) WFLYJPA0010: Starting Persistence Unit (phase 2 of 2) Service 'http://ift.tt/1TYvqCX'
2015-06-28 18:23:08,165 ERROR [org.jboss.msc.service.fail] (MSC service thread 1-1) MSC000001: Failed to start service jboss.jgroups.channel.ee: org.jboss.msc.service.StartException in service jboss.jgroups.channel.ee: java.security.PrivilegedActionException: java.net.BindException: [TCP] /127.12.77.1 is not a valid address on any local network interface
at org.wildfly.clustering.jgroups.spi.service.ChannelBuilder.start(ChannelBuilder.java:79)
at org.jboss.msc.service.ServiceControllerImpl$StartTask.startService(ServiceControllerImpl.java:1948)
at org.jboss.msc.service.ServiceControllerImpl$StartTask.run(ServiceControllerImpl.java:1881)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.security.PrivilegedActionException: java.net.BindException: [TCP] /127.12.77.1 is not a valid address on any local network interface
at org.wildfly.security.manager.WildFlySecurityManager.doChecked(WildFlySecurityManager.java:638)
at org.jboss.as.clustering.jgroups.JChannelFactory.createChannel(JChannelFactory.java:99)
at org.wildfly.clustering.jgroups.spi.service.ChannelBuilder.start(ChannelBuilder.java:77)
... 5 more
Caused by: java.net.BindException: [TCP] /127.12.77.1 is not a valid address on any local network interface
at org.jgroups.util.Util.checkIfValidAddress(Util.java:3480)
at org.jgroups.stack.Configurator.ensureValidBindAddresses(Configurator.java:902)
at org.jgroups.stack.Configurator.setupProtocolStack(Configurator.java:118)
at org.jgroups.stack.Configurator.setupProtocolStack(Configurator.java:57)
at org.jgroups.stack.ProtocolStack.setup(ProtocolStack.java:477)
at org.jgroups.JChannel.init(JChannel.java:854)
at org.jgroups.JChannel.<init>(JChannel.java:159)
at org.jboss.as.clustering.jgroups.JChannelFactory$1.run(JChannelFactory.java:96)
at org.jboss.as.clustering.jgroups.JChannelFactory$1.run(JChannelFactory.java:93)
at org.wildfly.security.manager.WildFlySecurityManager.doChecked(WildFlySecurityManager.java:634)
... 7 more
The address mentioned, 127.12.77.1, is $OPENSHIFT_WILDFLY_IP. I have no idea what is causing this issue. At first I thought it was a database connectivity issue, because the failure happens right when phase 2 of starting the persistence unit kicks in. I connected to the database on OpenShift and saw that it was created successfully, so that's probably not it, but here's the persistence.xml I'm using:
<?xml version="1.0" encoding="UTF-8"?>
<persistence version="2.0" xmlns="http://java.sun.com/xml/ns/persistence"
             xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
             xsi:schemaLocation="http://java.sun.com/xml/ns/persistence http://java.sun.com/xml/ns/persistence/persistence_2_0.xsd">
    <persistence-unit name="cookingPU">
        <provider>org.hibernate.jpa.HibernatePersistenceProvider</provider>
        <jta-data-source>java:jboss/datasources/PostgreSQLDS</jta-data-source>
        <properties>
            <property name="hibernate.dialect" value="org.hibernate.dialect.PostgreSQL9Dialect" />
        </properties>
    </persistence-unit>
</persistence>
The datasource used is the default one. I didn't change anything in standalone.xml.
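To double-check the datasource from inside the container, I could also expose a tiny diagnostic resource like the sketch below (it's not part of the real project; the path and class name are made up). It injects the same java:jboss/datasources/PostgreSQLDS pool the persistence unit points at and simply opens a connection:

import java.sql.Connection;

import javax.annotation.Resource;
import javax.sql.DataSource;
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.Produces;
import javax.ws.rs.core.MediaType;

@Path("dbcheck")
@Produces(MediaType.TEXT_PLAIN)
public class DataSourceCheck {

    // Same JNDI name that the persistence unit references
    @Resource(lookup = "java:jboss/datasources/PostgreSQLDS")
    private DataSource dataSource;

    @GET
    public String check() throws Exception {
        // If this succeeds, the datasource itself is fine and the
        // BindException is not a database connectivity problem
        try (Connection connection = dataSource.getConnection()) {
            return "connected to " + connection.getMetaData().getDatabaseProductName();
        }
    }
}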
Another thing I noticed is that the deployment problem appears as soon as I add any EJB to the project. This is a simple one I tried to use:
import javax.ejb.Stateless;

@Stateless
public class AnyEjb {

    public String hello() {
        return "Hi there!";
    }
}
This is defined in the EJB module. In the web module I then have this class calling it:
@Path("anything")
@Consumes(MediaType.APPLICATION_JSON)
@Produces(MediaType.APPLICATION_JSON)
public class AnyEndpoint {
@EJB
private AnyEjb anyEjb;
@GET
public String sayHi() {
return anyEjb.hello();
}
}
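(For completeness: JAX-RS in the web module is activated the usual way with an Application subclass, roughly like the sketch below; the class name and path are only illustrative.)

import javax.ws.rs.ApplicationPath;
import javax.ws.rs.core.Application;

// Enables JAX-RS for the web module under the given path prefix
@ApplicationPath("rest")
public class RestActivator extends Application {
}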
I'm not sure if, or how, this can be connected with the BindException. I've tried running the application locally with both the standalone and the standalone-full-ha profile, and it works in both cases. I just feel it has to be some issue with the OpenShift configuration, but I have no idea where to look anymore. I'm very new to OpenShift and Java EE, so please point me in the right direction. Any help will be much appreciated.
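For what it's worth, here is a small standalone probe I could run on the gear over SSH to list the addresses actually bound to local interfaces; it does roughly the same check that JGroups performs in Util.checkIfValidAddress before binding. The class name is just an example:

import java.net.InetAddress;
import java.net.NetworkInterface;
import java.util.Collections;

public class ListLocalAddresses {

    // Prints every address bound to a local network interface. If 127.12.77.1
    // does not show up in this list, the JGroups validation that throws the
    // BindException above will fail in exactly the same way.
    public static void main(String[] args) throws Exception {
        for (NetworkInterface nic : Collections.list(NetworkInterface.getNetworkInterfaces())) {
            for (InetAddress address : Collections.list(nic.getInetAddresses())) {
                System.out.println(nic.getName() + " -> " + address.getHostAddress());
            }
        }
    }
}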