Fuse IDE: how to define a database table endpoint - MySQL

I have heard a lot of successful integration stories when it comes to Apache Camel with Fuse. Hence, I am just starting to explore the Fuse IDE, with a simple task in mind that I would like to achieve:
Read a fixed-length file
Parse the fixed-length file
Persist it to a MySQL database table
I am only able to get as far as:
Read the fixed-length file (with the endpoint "file:src/data/Japan?noop=true")
Define a marshal step with Bindy and define a POJO model package with the @FixedLengthRecord annotation
Then I am stuck... how do I persist the POJO into a MySQL database table? I can see JDBC, iBATIS and JPA endpoints, but how do I accomplish that in Fuse IDE?
My POJO package:
package com.mbww.model;
import org.apache.camel.dataformat.bindy.annotation.DataField;
import org.apache.camel.dataformat.bindy.annotation.FixedLengthRecord;
@FixedLengthRecord(length=91)
public class Japan {

    @DataField(pos=1, length=10)
    private String TNR;

    @DataField(pos=11, length=10)
    private String ATR;

    @DataField(pos=21, length=70)
    private String STR;
}

Well, you can use any of the following components to actually read from and write to the database:
JDBC
IBATIS
MyBATIS
SPRING-JDBC
SQL
Custom Processor
I am going to show you how to use a custom processor to insert the rows into a table. The main reason for this is that you will get to work with the messages and the exchange, and this will give you more of an insight into Camel. All of the other components can be used by following the documentation on the Camel site.
So let's review what you have. You are reading the file and converting the body to a Bindy object. For each line in your text file, Camel will send a Bindy object of class com.mbww.model.Japan to the next endpoint. This next endpoint needs to talk to the database. There is one problem I can spot immediately: you are using marshal where you should be using unmarshal.
The documentation clearly states: If you receive a message from one of the Camel Components such as File, HTTP or JMS you often want to unmarshal the payload into some bean so that you can process it using some Bean Integration or perform Predicate evaluation and so forth. To do this use the unmarshal word in the DSL in Java or the Xml Configuration.
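For reference, here is a minimal Java DSL sketch of that unmarshal step (a hedged illustration, assuming a reasonably recent Camel version in which BindyFixedLengthDataFormat can be constructed from the model class directly); the blueprint XML equivalent appears at the end of this answer, and PersistToDatabase is the custom processor built below:
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.dataformat.bindy.fixed.BindyFixedLengthDataFormat;

import com.mbww.model.Japan;

public class JapanRoute extends RouteBuilder {
    @Override
    public void configure() {
        from("file:src/data/japan?noop=true")
            .unmarshal(new BindyFixedLengthDataFormat(Japan.class)) // each line becomes a Japan object
            .bean(PersistToDatabase.class);                         // custom processor shown below
    }
}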
Your Bindy class looks good, but it is missing getters and setters; modify the class to look like this:
package com.mbww.model;
import org.apache.camel.dataformat.bindy.annotation.DataField;
import org.apache.camel.dataformat.bindy.annotation.FixedLengthRecord;
@FixedLengthRecord(length=91)
public class Japan {

    @DataField(pos=1, length=10)
    private String TNR;

    @DataField(pos=11, length=10)
    private String ATR;

    @DataField(pos=21, length=70)
    private String STR;

    public String getTNR() {
        return TNR;
    }

    public void setTNR(String tNR) {
        TNR = tNR;
    }

    public String getATR() {
        return ATR;
    }

    public void setATR(String aTR) {
        ATR = aTR;
    }

    public String getSTR() {
        return STR;
    }

    public void setSTR(String sTR) {
        STR = sTR;
    }
}
First you need to be able to connect to your database from your route. The first thing to do is add the MySQL driver JAR to your Maven dependencies: open your pom.xml file and add the following dependency to it.
<dependency>
<groupId>mysql</groupId>
<artifactId>mysql-connector-java</artifactId>
<!-- use this version of the driver or a later version of the driver -->
<version>5.1.25</version>
</dependency>
Now we need to declare a custom processor to use in the route; it will use this driver and insert the received body into a table.
So let's create a new class in Fuse IDE called PersistToDatabase; code below:
package com.mbww.JapanData;

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.Map;

import org.apache.camel.Body;
import org.apache.camel.Exchange;
import org.apache.camel.Handler;
import org.apache.camel.Headers;

import com.mbww.model.Japan;

public class PersistToDatabase {

    @Handler
    public void PersistRecord(@Body Japan msgBody, @Headers Map hdr, Exchange exch) throws Exception {
        // Make sure the MySQL driver is on the classpath and registered.
        try {
            Class.forName("com.mysql.jdbc.Driver");
        } catch (ClassNotFoundException e) {
            System.out.println("Where is your MySQL JDBC Driver?");
            e.printStackTrace();
            return;
        }
        System.out.println("MySQL JDBC Driver Registered!");

        // Open a connection for this exchange.
        Connection connection = null;
        try {
            connection = DriverManager.getConnection("jdbc:mysql://localhost:3306/databasename", "root", "password");
        } catch (SQLException e) {
            System.out.println("Connection Failed! Check output console");
            e.printStackTrace();
            return;
        }
        if (connection != null) {
            System.out.println("You made it, take control of your database now!");
        } else {
            System.out.println("Failed to make connection!");
            return;
        }

        // Insert the record carried in the message body.
        try {
            PreparedStatement stmt = connection.prepareStatement("INSERT INTO JapanDate(TNR,ATR,STR) VALUES(?,?,?)");
            stmt.setString(1, msgBody.getTNR());
            stmt.setString(2, msgBody.getATR());
            stmt.setString(3, msgBody.getSTR());
            int rows = stmt.executeUpdate();
            System.out.println("Number of rows inserted: " + Integer.toString(rows));
        } catch (Exception e) {
            System.out.println("Error in executing sql statement: " + e.getMessage());
            throw new Exception(e.getMessage());
        } finally {
            connection.close();
        }
    }
}
This class is a POJO, nothing fancy except the @Handler annotation on PersistRecord. This annotation tells Camel that the PersistRecord method will handle the message exchange. You will also notice that the method PersistRecord has a parameter of type Japan. As mentioned earlier, when you call the conversion step in your Camel route it translates each line into a Japan object and passes it along the route.
The rest of the code just handles the JDBC connection and calls an insert statement.
We are almost done; just one last thing to do. We need to declare this class in our Camel route XML. This file will typically be called camel-route.xml or blueprint.xml, depending on your archetype. Open the Source tab and add the line <bean id="JapanPersist" class="com.mbww.JapanData.PersistToDatabase"/> before the <camelContext> tag.
This declares a new bean called JapanPersist based on the class we just wrote. You can now reference this bean inside your Camel route.
Thus the final route XML file should look something like this:
<?xml version="1.0" encoding="UTF-8"?>
<blueprint xmlns="http://www.osgi.org/xmlns/blueprint/v1.0.0"
           xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
           xmlns:camel="http://camel.apache.org/schema/blueprint"
           xsi:schemaLocation="
             http://www.osgi.org/xmlns/blueprint/v1.0.0 http://www.osgi.org/xmlns/blueprint/v1.0.0/blueprint.xsd
             http://camel.apache.org/schema/blueprint http://camel.apache.org/schema/blueprint/camel-blueprint.xsd">

  <bean id="JapanPersist" class="com.mbww.JapanData.PersistToDatabase"/>

  <camelContext trace="false" id="blueprintContext" xmlns="http://camel.apache.org/schema/blueprint">
    <route id="JapanDataFromFileToDB">
      <from uri="file:src/data/japan"/>
      <unmarshal ref="Japan"/>
      <bean ref="JapanPersist"/>
    </route>
  </camelContext>
</blueprint>
Once you understand this technique you can start scaling the solution by using a splitter, connection pooling and threading to do massive amounts of concurrent inserts, as sketched below.
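One way to sketch that scaling in the Java DSL (a hedged illustration, not part of the tested route above; the endpoint, the line token and the pool size are assumptions):
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.dataformat.bindy.fixed.BindyFixedLengthDataFormat;

import com.mbww.model.Japan;

public class JapanBulkRoute extends RouteBuilder {
    @Override
    public void configure() {
        from("file:src/data/japan?noop=true")
            .split(body().tokenize("\n")).streaming()   // one exchange per line, streaming the file
                .threads(5)                             // up to 5 concurrent inserts
                .unmarshal(new BindyFixedLengthDataFormat(Japan.class))
                .bean(PersistToDatabase.class)
            .end();
    }
}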
Using the technique above you learned how to inject your own beans into a Camel route, which gives you the ability to work with the messages directly in code.
I have not tested the code so there will probably be a bug or two but the idea should be clear.

Related

XML to JSON in Camel Routing not working

I am trying to convert an XML message to JSON using a Camel route and save it into a file. Getting the XML message from the source and saving it to the destination file etc. is working. But when I try to convert to JSON, it does not work. It does not even throw any error/exception in the logs. I am running on an OSGi container.
public class CamelRouter extends RouteBuilder {
    @Override
    public void configure() throws Exception {
        from("file://C:/test/Sample.xml")
            .routeId("file-to-file")
            .log(LoggingLevel.INFO, "RouteID file-to-file !!!!! starting")
            // From XML to JSON
            .bean(CamelRouter.class, "convertXmlToJson")
            .log(LoggingLevel.INFO, "From XML to JSON !!!!! Done")
            .to("file://C:/test/JSONMessages")
            .log(LoggingLevel.INFO, "Converted Message Saved successfully");
    }
The bean method convertXmlToJson that converts XML to JSON is shown below:
public String convertXmlToJson(String msg) {
    log.info("NOW calling JSON conversion");
    String jsonStr = null;
    log.info("MESSAGE conversion starting : "); // After this message nothing happened
    XMLSerializer xmlReader = new XMLSerializer();
    log.info("MESSAGE before conversion : " + msg);
    jsonStr = xmlReader.read(msg).toString();
    log.info("JSON data : " + jsonStr);
    return jsonStr;
}
Does anyone know why it is not executing the XMLSerializer portion? I tried this approach because camel-xmljson's marshal().xmljson() call also gives me the same results. Nothing happens after the xmljson() call in my Camel routing.
Things that I checked:
The camel-xmljson feature is up and running in OSGi
The dependencies mentioned on the Apache camel-xmljson website have been added to my pom file: xom, camel-xmljson, etc.
Am I missing anything here? Please help
The problem with your route is that your bean component handler method resides within your route builder class, and you invoke the bean component in a way that triggers another instantiation of that route builder class.
Personally, I would move convertXmlToJson to an appropriate utility class, as sketched below. That way you reduce the mixing of concerns in the route builder, and the bean component should work fine.
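For example, a minimal sketch of that refactoring (the class name XmlToJsonConverter is illustrative; it assumes json-lib's XMLSerializer, which the question already uses):
import net.sf.json.xml.XMLSerializer;

public class XmlToJsonConverter {

    // Plain utility method: takes the XML payload as a String, returns JSON text.
    public String convertXmlToJson(String msg) {
        return new XMLSerializer().read(msg).toString();
    }
}
The route would then call .bean(XmlToJsonConverter.class, "convertXmlToJson") instead of .bean(CamelRouter.class, "convertXmlToJson").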
Alternatively, your route might work, if you invoke the bean component like this:
.bean(this, "convertXmlToJson")

In Cypher, How to modify valid URL protocols for LOAD CSV command

This question has two parts:
By default, what URL protocols are considered valid for specifying resources to Cypher's LOAD CSV command?
So far, I've successfully loaded CSV files into Neo4j using the http and file protocols. A comment on this unrelated question indicates that ftp works as well, but I haven't tried this because I have no use case.
What practical options do I have to configure non-standard URI protocols? I'm running up against a Neo.TransientError.Statement.ExternalResourceFailure: with "Invalid URL specified (unknown protocol)". Other than digging into the Neo4j source, is there any way to modify this validation/setting, provided that the host machine is capable of resolving the resource with the specified protocol?
Neo4j relies on the capabilities of the JVM. According to https://docs.oracle.com/javase/7/docs/api/java/net/URL.html the default protocols are:
http, https, ftp, file, jar
Please note that file URLs are interpreted from the server's point of view and not from the client side (a common source of confusion).
To use custom URLs you need to understand how the JVM deals with those. The javadocs for the URL class explain an approach using a system property to provide custom URL handlers. It should be good enough to provide this system property in neo4j-wrapper.conf and drop the jar file containing your handler classes into the plugins folder. (Note: I did not validate that approach myself, but I'm pretty confident that it will work.)
Here is a complete example, using the technique of implementing your own URLStreamHandler to handle the resource protocol. You must name your class 'Handler', and the last segment of the package name must be the protocol name (in this case, resource).
src/main/java/com/example/protocols/resource/Handler.java:
package com.example.protocols.resource;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.net.URL;
import java.net.URLConnection;
import java.net.URLStreamHandler;
public class Handler extends URLStreamHandler {

    private final ClassLoader classLoader;

    public Handler() {
        this.classLoader = getClass().getClassLoader();
    }

    @Override
    protected URLConnection openConnection(URL url) throws IOException {
        URL resource = classLoader.getResource(url.getPath());
        if (resource == null) {
            throw new FileNotFoundException("Resource file not found: " + url.getPath());
        }
        return resource.openConnection();
    }
}
From here, we need to set the system property java.protocol.handler.pkgs to include the base package com.example.protocols so that the protocol is registered. This can be done statically in a Neo4j ExtensionFactory. Since the class gets loaded by Neo4j, we know that the static block will be executed. We also need to provide our own URLAccessRule, since Neo4j by default only allows use of a few select protocols. This can also happen in the ExtensionFactory.
src/main/java/com/example/protocols/ProtocolInitializerFactory.java:
package com.example.protocols;
import org.neo4j.annotations.service.ServiceProvider;
import org.neo4j.graphdb.security.URLAccessRule;
import org.neo4j.kernel.extension.ExtensionFactory;
import org.neo4j.kernel.extension.ExtensionType;
import org.neo4j.kernel.extension.context.ExtensionContext;
import org.neo4j.kernel.lifecycle.Lifecycle;
import org.neo4j.kernel.lifecycle.LifecycleAdapter;
@ServiceProvider
public class ProtocolInitializerFactory extends ExtensionFactory<ProtocolInitializerFactory.Dependencies> {

    private static final String PROTOCOL_HANDLER_PACKAGES = "java.protocol.handler.pkgs";
    private static final String PROTOCOL_PACKAGE = ProtocolInitializerFactory.class.getPackageName();

    static {
        String currentValue = System.getProperty(PROTOCOL_HANDLER_PACKAGES, "");
        if (currentValue.isEmpty()) {
            System.setProperty(PROTOCOL_HANDLER_PACKAGES, PROTOCOL_PACKAGE);
        } else if (!currentValue.contains(PROTOCOL_PACKAGE)) {
            System.setProperty(PROTOCOL_HANDLER_PACKAGES, currentValue + "|" + PROTOCOL_PACKAGE);
        }
    }

    public interface Dependencies {
        URLAccessRule urlAccessRule();
    }

    public ProtocolInitializerFactory() {
        super(ExtensionType.DATABASE, "ProtocolInitializer");
    }

    @Override
    public Lifecycle newInstance(ExtensionContext context, Dependencies dependencies) {
        URLAccessRule urlAccessRule = dependencies.urlAccessRule();
        return LifecycleAdapter.onInit(() -> {
            URLAccessRule customRule = (config, url) -> {
                if ("resource".equals(url.getProtocol())) { // Check the protocol name
                    return url; // Optionally, you can validate the URL here and throw an exception if it is not valid or should not be allowed access
                }
                return urlAccessRule.validate(config, url);
            };
            context.dependencySatisfier().satisfyDependency(customRule);
        });
    }
}
After setting this up, follow the guide to packaging these classes as a Neo4j plugin and drop it into your database's plugins directory.
Admittedly, needing to override the default URLAccessRule feels a little bit shady. It may be better to simply implement the URLStreamHandler, and use another CSV loading method like APOC's apoc.load.csv. This will not require overriding the URLAccessRule, but it will require setting the Java system property java.protocol.handler.pkgs.
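For completeness, a minimal sanity-check sketch (illustrative, not from the original answer): once the JVM has been started with -Djava.protocol.handler.pkgs=com.example.protocols, a resource: URL resolves through the Handler shown above. The CSV path below is a hypothetical classpath resource.
import java.io.InputStream;
import java.net.URL;

public class ResourceUrlSmokeTest {
    public static void main(String[] args) throws Exception {
        URL url = new URL("resource:data/people.csv");
        try (InputStream in = url.openStream()) {
            System.out.println("Opened resource, first byte: " + in.read());
        }
        // The Cypher side would then be:
        // LOAD CSV FROM 'resource:data/people.csv' AS row RETURN row
    }
}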

Dependency injection using ".properties" file

I am using Java EE 6 and need to load configuration from a ".properties" file. Is there a recommended way (best practice) to load the values from the configuration file using dependency injection? I found annotations for this in Spring, but I have not found a "standard" annotation for Java EE.
This guy has developed a solution from scratch:
http://weblogs.java.net/blog/jjviana/archive/2010/05/18/applicaction-configuration-java-ee-6-using-cdi-simple-example
"I couldn't find a simple example of how to configure your application
with CDI by reading configuration attributes from a file..."
But I wonder if there is a more standard way instead of creating a configuration factory...
Configuration annotation
package com.ubiteck.cdi;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import javax.enterprise.util.Nonbinding;
import javax.inject.Qualifier;
@Qualifier
@Retention(RetentionPolicy.RUNTIME)
public @interface InjectedConfiguration {

    /**
     * Bundle key
     * @return a valid bundle key or ""
     */
    @Nonbinding String key() default "";

    /**
     * Is it a mandatory property?
     * @return true if mandatory
     */
    @Nonbinding boolean mandatory() default false;

    /**
     * Default value if not provided
     * @return default value or ""
     */
    @Nonbinding String defaultValue() default "";
}
The configuration factory could look like this:
import java.text.MessageFormat;
import java.util.MissingResourceException;
import java.util.ResourceBundle;
import javax.enterprise.inject.Produces;
import javax.enterprise.inject.spi.InjectionPoint;
public class ConfigurationInjectionManager {

    static final String INVALID_KEY = "Invalid key '{0}'";
    static final String MANDATORY_PARAM_MISSING = "No definition found for a mandatory configuration parameter : '{0}'";
    private final String BUNDLE_FILE_NAME = "configuration";
    private final ResourceBundle bundle = ResourceBundle.getBundle(BUNDLE_FILE_NAME);

    @Produces
    @InjectedConfiguration
    public String injectConfiguration(InjectionPoint ip) throws IllegalStateException {
        InjectedConfiguration param = ip.getAnnotated().getAnnotation(InjectedConfiguration.class);
        if (param.key() == null || param.key().length() == 0) {
            return param.defaultValue();
        }
        String value;
        try {
            value = bundle.getString(param.key());
            if (value == null || value.trim().length() == 0) {
                if (param.mandatory())
                    throw new IllegalStateException(MessageFormat.format(MANDATORY_PARAM_MISSING, new Object[]{param.key()}));
                else
                    return param.defaultValue();
            }
            return value;
        } catch (MissingResourceException e) {
            if (param.mandatory()) throw new IllegalStateException(MessageFormat.format(MANDATORY_PARAM_MISSING, new Object[]{param.key()}));
            return MessageFormat.format(INVALID_KEY, new Object[]{param.key()});
        }
    }
}
Tutorial with explanation and Arquillian test
Even though it does not exactly cover your question, this part of the Weld documentation might be of interest to you.
Having mentioned this - no, there is no standard way to inject arbitrary resources / resource files. I guess it's simply beyond the scope of a spec to standardise such a highly custom-dependent requirement (Spring is no specification; they can simply implement whatever they like). However, what CDI provides is a strong (aka typesafe) mechanism to inject configuration-holding beans on one side, and a flexible producer mechanism to read and create such beans on the other side. This is definitely the recommended way you were asking about.
The approach you are linking to is certainly a pretty good one - even though it might be too much for your needs, depending on the kind of properties you are planning to inject.
A very CDI-ish way of continuing would be to develop a CDI extension (that would nicely encapsulate all required classes) and deploy it independently with your projects. Of course you can also contribute to the CDI-extension catalog or even Apache Deltaspike.
See @ConfigProperty of Apache DeltaSpike.
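A hedged sketch of what that looks like in practice (the property name and default are illustrative; it assumes the DeltaSpike core API and implementation modules are on the classpath):
import javax.inject.Inject;
import org.apache.deltaspike.core.api.config.ConfigProperty;

public class MyService {

    // Resolved by DeltaSpike from its configured property sources
    @Inject
    @ConfigProperty(name = "endpoint.url", defaultValue = "http://localhost:8080")
    private String endpointUrl;
}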
The only "standard" way of doing this would be to use a qualifier with a nonbinding annotation member, and make sure all of your injections are dependent scoped. Then in your producer you can get a hold of the InjectionPoint and get the key off the qualifier in the injection point. You'd want something like this:
@Qualifier
public @interface Property {
    @Nonbinding String value() default "";
}
...
@Inject @Property("myKey") String myKey;
...
@Produces @Property public String getPropertyByKey(InjectionPoint ip) {
    Set<Annotation> qualifiers = ip.getQualifiers();
    // Loop through the qualifiers looking for Property.class and save that off as 'property'
    return ResourceBundle.getBundle(...).getString(property.value());
}
There are obviously some enhancements you can do to that code, but it should be enough to get you started down the right track.

GWT + EJB + MYSQL

I have got some questions concerning serialization and persistence.
First of all, I have a GWT project with client code and a servlet to communicate with my EJB project.
In the EJB project there are some persistent entity classes with references among each other, and beans to manage them.
The references may look like this:
        Object A
       /        \
  Object B    Object C
                    \
                 Object D
Mostly there are 1:n relationships, which I have to model with @OneToMany or something like this.
I store them in a MySQL database, which already works with Strings.
With Strings I have no problems transferring them from the GWT client side over the GWT servlet to the EJB bean, then into the database, and the same way back to the client side.
But when I try to transfer a class object I created myself (a POJO?) between the GWT client and the EJB, I always get a SerializationException.
Is it because of the GWT servlet? I read somewhere that you have to use DTOs or value objects? Is this correct?
Or isn't there an easy way to solve this?
Please see
http://code.google.com/webtoolkit/doc/latest/DevGuideServerCommunication.html#DevGuideSerializableTypes
All classes that conform to the above specification
or implement com.google.gwt.user.client.rpc.IsSerializable can be serialized.
For example:
import com.google.gwt.user.client.rpc.IsSerializable;
import java.util.HashMap;
public class Row implements IsSerializable
{
    private HashMap<String, Object> _row;

    public Row()
    {
        _row = new HashMap<String, Object>();
    }

    public Row(HashMap<String, Object> row)
    {
        _row = row;
    }

    public Object getCellValue(String columnName)
    {
        return _row.get(columnName);
    }

    public void setCellValue(String columnName, Object value)
    {
        _row.put(columnName, value);
    }

    public HashMap<String, Object> getRow()
    {
        return _row;
    }
}
In the documentation there is also the link below; I've never tried that:
http://code.google.com/webtoolkit/doc/latest/DevGuideServerCommunication.html#DevGuideCustomSerialization
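Regarding the DTO/value-object part of the question: a hedged sketch of what such a transfer object might look like for the A/B hierarchy (the class names and fields are illustrative). The point is that only plain serializable state crosses the wire, not JPA entities or proxies:
import com.google.gwt.user.client.rpc.IsSerializable;
import java.util.ArrayList;

public class ObjectADto implements IsSerializable {

    private String id;
    private ArrayList<ObjectBDto> children = new ArrayList<ObjectBDto>();

    public ObjectADto() { } // GWT-RPC requires a no-arg constructor

    public String getId() { return id; }
    public void setId(String id) { this.id = id; }
    public ArrayList<ObjectBDto> getChildren() { return children; }
}

class ObjectBDto implements IsSerializable {

    private String id;

    public ObjectBDto() { }

    public String getId() { return id; }
    public void setId(String id) { this.id = id; }
}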

Lazy loading error in JSON serializer

I have this kind of @OneToOne Hibernate relationship:
public class Address implements Serializable {
    private String id;
    private String city;
    private String country;
    // setters and getters omitted
}

public class Student implements Serializable {
    private String id;
    private String firstName;
    private String lastName;
    private Address address;
}
The address item is mapped as LAZY.
Now I want to fetch a student and its address using
session.load(Student.class, id);
in my daoService.
Then I return it as JSON from my Spring MVC controller:
@RequestMapping(value="/getStudent.do", method=RequestMethod.POST)
@ResponseBody
public Student getStudent(@RequestParam("studentId") String id){
    Student student = daoService.getStudent(id);
    return student;
}
Unfortunately, it's not working because of the lazy classes, and it fails with:
org.codehaus.jackson.map.JsonMappingException: No serializer found for class org.hibernate.proxy.pojo.javassist.JavassistLazyInitializer and no properties discovered to create BeanSerializer (to avoid exception, disable SerializationConfig.Feature.FAIL_ON_EMPTY_BEANS) ) (through reference chain: com.vanilla.objects.Student_$$_javassist_1["address"]->com.vanilla.objects.Address_$$_javassist_0["handler"])
at org.codehaus.jackson.map.ser.StdSerializerProvider$1.serialize(StdSerializerProvider.java:62)
I do use OpenSessionInViewInterceptor and it works just fine.
I understand that I can use a left join HQL query to retrieve the student and address together and solve the problem that way. I also understand that changing the relation to EAGER will solve it.
But how can I serialize lazy classes to JSON using the standard Jackson message converter, which of course I added to my XML file?
The easiest solution: Don't serialize entities, use Value Objects.
If that is not an option for you, make sure that the entity object is detached.
With JPA (2), you would use EntityManager.detach(entity), with plain Hibernate the equivalent is Session.evict(entity).
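A minimal sketch of that detach step (the DAO method shape is illustrative, not from the original answer):
import javax.persistence.EntityManager;

public class StudentDao {

    private EntityManager entityManager; // injected elsewhere

    public Student getDetachedStudent(String id) {
        Student student = entityManager.find(Student.class, id); // load the entity
        entityManager.detach(student);                           // JPA 2; with plain Hibernate: session.evict(student)
        return student;
    }
}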
I once wrote a processor to handle this, but now it's easy to fix by using the Jackson Hibernate module.
Within your DAO method, add Hibernate.initialize(<your getter method>); to resolve this:
Student student = findById(<yourId>);
Hibernate.initialize(student.getAddress());
...
return student;
Try it like the above.
There is another option that solves your problem. You can add this filter in web.xml:
<filter>
  <filter-name>springOpenEntityManagerInViewFilter</filter-name>
  <filter-class>org.springframework.orm.jpa.support.OpenEntityManagerInViewFilter</filter-class>
  <init-param>
    <param-name>entityManagerFactoryBeanName</param-name>
    <param-value>entityManagerFactory</param-value>
  </init-param>
</filter>
<filter-mapping>
  <filter-name>springOpenEntityManagerInViewFilter</filter-name>
  <url-pattern>/*</url-pattern>
</filter-mapping>
The problem is that the entities are loaded lazily and serialization happens before they get fully loaded.
But how can I serialize to JSON lazy classes using standard jackson
message converter which of cause I added to my XML file.
First of all, I don't advise using DTOs/value objects only to solve this issue.
You may find it easy at the beginning, but with each new development or change, the duplicated code means making every modification twice... otherwise, bugs.
I don't mean that VOs or DTOs are bad smells, but you should use them for the reasons they were designed for (such as providing a content/structure that differs according to the logical layer, or solving an otherwise unsolvable serialization problem).
If you have a clean and efficient way to solve the serialization issue without VOs/DTOs and you don't need them, don't use them.
And on that subject, there are many ways to solve the lazy loading issue when you use Jackson with Hibernate entities.
Actually, the simplest way is to use FasterXML/jackson-datatype-hibernate:
Project to build Jackson module (jar) to support JSON serialization
and deserialization of Hibernate (http://hibernate.org) specific
datatypes and properties; especially lazy-loading aspects.
It provides Hibernate3Module/Hibernate4Module/Hibernate5Module, extension modules that can be registered with ObjectMapper to provide a well-defined set of extensions related to Hibernate specificities.
To get it working, you just need to add the required dependency and make the Jackson module available during processing where it is required.
If you use Hibernate 3:
<dependency>
<groupId>com.fasterxml.jackson.datatype</groupId>
<artifactId>jackson-datatype-hibernate3</artifactId>
<version>${jackson.version.datatype}</version>
</dependency>
If you use Hibernate 4:
<dependency>
<groupId>com.fasterxml.jackson.datatype</groupId>
<artifactId>jackson-datatype-hibernate4</artifactId>
<version>${jackson.version.datatype}</version>
</dependency>
And so forth...
where jackson.version.datatype should be the same for the Jackson version in use and for the jackson-datatype extension.
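Outside of Spring Boot (covered next), registering the module by hand is a one-liner on the ObjectMapper; here is a hedged sketch for Hibernate 4 (the resulting mapper would then be plugged into whatever MappingJackson2HttpMessageConverter your Spring XML declares):
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.datatype.hibernate4.Hibernate4Module;

public class JacksonConfig {

    public static ObjectMapper hibernateAwareMapper() {
        ObjectMapper mapper = new ObjectMapper();
        // Teaches Jackson about Hibernate proxies and lazy collections, so
        // uninitialized associations are written as null instead of failing.
        mapper.registerModule(new Hibernate4Module());
        return mapper;
    }
}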
If you use or may use Spring Boot, you just need to declare the module as a bean in a specific @Configuration class or in the @SpringBootApplication class, and it will be automatically registered with any Jackson ObjectMapper created.
The "74.3 Customize the Jackson ObjectMapper" section of the Spring Boot documentation states that:
Any beans of type com.fasterxml.jackson.databind.Module will be
automatically registered with the auto-configured
Jackson2ObjectMapperBuilder and applied to any ObjectMapper instances
that it creates. This provides a global mechanism for contributing
custom modules when you add new features to your application.
For example:
@Configuration
public class MyJacksonConfig {

    @Bean
    public Module hibernate5Module() {
        return new Hibernate5Module();
    }
}
or:
@SpringBootApplication
public class AppConfig {

    public static void main(String[] args) throws IOException {
        SpringApplication.run(AppConfig.class, args);
    }

    @Bean
    public Module hibernate5Module() {
        return new Hibernate5Module();
    }
}