
Jun 30, 2013

Core Java Coding Interview Question -- validating an input String to be alphanumeric

Q. Can you write sample code showing the different ways in which you can validate an input String, returning true if the input is alphanumeric and false otherwise?
A. It can be done in a number of different ways:
  1. Using regex.
  2. Without using regex.
 Firstly, using regex.

Step 1: Have the right dependency jars in the pom.xml file as shown below.

<dependency>
 <groupId>commons-lang</groupId>
 <artifactId>commons-lang</artifactId>
 <version>2.6</version>
</dependency>

<dependency>
 <groupId>junit</groupId>
 <artifactId>junit</artifactId>
 <version>4.10</version>
 <scope>test</scope>
</dependency>


Step 2: The implementation class.

package com.mycompany.app5;

import org.apache.commons.lang.StringUtils;

public class AlphaNumeric
{
    
    public boolean isAlphaNumeric(String input)
    {
        //fail fast
        if (StringUtils.isEmpty(input))
        {
            throw new IllegalArgumentException("The input cannot be empty !!!");
        }
        
        boolean result = false;
        
        if (input.matches("[a-zA-Z0-9]+")) // one or more alpha numeric characters
        {
            result = true;
        }
        
        return result;
        
    }
}


Step 3: The JUnit test class.

package com.mycompany.app5;

import junit.framework.Assert;

import org.junit.Before;
import org.junit.Test;

public class AlphaNumericTest
{ 
    AlphaNumeric an;
    
    @Before
    public void setUp()
    {
        an = new AlphaNumeric();
    }
    
    @Test
    public void testAlphaNumericHappyPath()
    {
        Assert.assertEquals(true, an.isAlphaNumeric("AUD123"));
        Assert.assertEquals(true, an.isAlphaNumeric("123aud"));
        Assert.assertEquals(true, an.isAlphaNumeric("123"));
        Assert.assertEquals(true, an.isAlphaNumeric("hgD"));
        
        Assert.assertEquals(false, an.isAlphaNumeric("hg*D"));
        Assert.assertEquals(false, an.isAlphaNumeric("@hgD"));
        Assert.assertEquals(false, an.isAlphaNumeric("@hgD"));
        Assert.assertEquals(false, an.isAlphaNumeric("@*#"));
    }
    
    @Test(expected = java.lang.IllegalArgumentException.class)
    public void testAlphaNumericUnHappyPath1()
    {
        Assert.assertEquals(true, an.isAlphaNumeric(null));
    }
    
    @Test(expected = java.lang.IllegalArgumentException.class)
    public void testAlphaNumericUnHappyPath2()
    {
        Assert.assertEquals(true, an.isAlphaNumeric(""));
    }
}

Secondly, using Apache commons-lang's StringUtils method directly, as shown below.

package com.mycompany.app5;

import org.apache.commons.lang.StringUtils;

public class AlphaNumeric
{
    
    public boolean isAlphaNumeric(String input)
    {
        //fail fast
        if (StringUtils.isEmpty(input))
        {
            throw new IllegalArgumentException("The input cannot be empty !!!");
        }
        
        boolean result = false;
        
        if (StringUtils.isAlphanumeric(input)) // one or more alpha numeric characters
        {
            result = true;
        }
        
        return result;
        
    }
}

Thirdly, using core Java without any library, by checking each character against ASCII ranges.

package com.mycompany.app5;


public class AlphaNumeric
{
    
    public boolean isAlphaNumeric(String input)
    {
        //fail fast
        if (input == null || input.isEmpty())
        {
            throw new IllegalArgumentException("The input cannot be empty !!!");
        }
        
        boolean result = true;
        
        //check each character against the ASCII table hex values
        for (int i = 0; i < input.length(); i++)
        {
            char c = input.charAt(i);
            if (c < 0x30 || (c >= 0x3a && c <= 0x40) || (c > 0x5a && c <= 0x60) || c > 0x7a)
            {
                result = false;
                break;
            }
        }
        
        return result;
        
    }
}

Alternatively, using the Character class.

package com.mycompany.app5;

import org.apache.commons.lang.StringUtils;

public class AlphaNumeric
{
    
    public boolean isAlphaNumeric(String input)
    {
        //fail fast
        if (StringUtils.isEmpty(input))
        {
            throw new IllegalArgumentException("The input cannot be empty !!!");
        }
        
        boolean result = true;
        
        //check each character using the Character class helper methods
        for (int i = 0; i < input.length(); i++)
        {
            char c = input.charAt(i);
            if (!Character.isDigit(c) && !Character.isLetter(c))
            {
                result = false;
                break;
            }
        }
        
        return result;
        
    }
}

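As a side note, String.matches() re-compiles the regex on every call. If the check is performed frequently, a minimal variant (the class name AlphaNumericPattern is just illustrative) can pre-compile the pattern once and reuse it, since java.util.regex.Pattern instances are immutable and thread-safe.

package com.mycompany.app5;

import java.util.regex.Pattern;

public class AlphaNumericPattern
{
    //compiled once and reused across calls
    private static final Pattern ALPHA_NUMERIC = Pattern.compile("[a-zA-Z0-9]+");

    public boolean isAlphaNumeric(String input)
    {
        //fail fast
        if (input == null || input.isEmpty())
        {
            throw new IllegalArgumentException("The input cannot be empty !!!");
        }

        return ALPHA_NUMERIC.matcher(input).matches();
    }
}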

Jun 29, 2013

JasperReports 5.0 tutorial with iReport 5.0

Step 1: Define a Person.java POJO class as shown below. This is the Java bean data source that is going to provide the data to the report.

package com.mycompany.app.jasper;

public class Person
{
    private String firstName;
    private String surname;
    private Integer age;
    
    public String getFirstName()
    {
        return firstName;
    }
    
    public void setFirstName(String firstName)
    {
        this.firstName = firstName;
    }
    
    public String getSurname()
    {
        return surname;
    }
    
    public void setSurname(String surname)
    {
        this.surname = surname;
    }
    
    public Integer getAge()
    {
        return age;
    }
    
    public void setAge(Integer age)
    {
        this.age = age;
    }
    
    @Override
    public String toString()
    {
        return "Person [firstName=" + firstName + ", surname=" + surname + ", age=" + age + "]";
    } 
}


Step 2: Download Jaspersoft iReport 5.0.0 and execute ireport.exe from the iReport bin folder (e.g. C:\ireport\5.0.0\iReport-5.0.0\bin). You now need to design the template using iReport 5.0.0 and generate the person-template.jrxml, which will be used in Java as the template to generate the report.

Step 3: Tell iReport where to find the classes by defining the classpath via Tools --> Options, and then select the "Classpath" tab.


Step 4: Create a new report via File --> New.


Provide the name and path where you want to generate the template jrxml file.


Click on "Next" and then "Finish" to get the designer screen where you can add labels and text fields for the report.

Step 5: Add the column headers for the report by dragging and dropping "Static Text" elements.


Step 6: Before you can map the Text Fields to bean data source values, you need to bring in the Person Java class created in Step 1. Click on the little data source icon as shown below to get the pop-up that allows you to define the com.mycompany.app.jasper.Person bean, select the firstName, surname, and age fields, click on "Add selected fields", and then click on "OK".


Step 7: Now you can map these fields to the new "Text Field" that you drag and drop as shown below. This will be done in the "detail1" section.


Step 8: Now, you need to map each Text Field to the corresponding field of Person.java. You do this by right-clicking on the field and selecting "Edit expression".




Step 9: In the "expression editor", you can map the field. Remove the default $F{Field} and "double-click" on "firstName" to get $F{firstName} and click on "Apply" as shown below. 



Step 10: Map all the fields as shown below, and click on the icon that compiles the design to generate the person-template.jrxml (text) and person-template.jasper (binary) files. If there are any compile errors, you need to fix them and re-compile. You only need the jrxml file to generate the report in Java.



Step 11: Before you write the Main.java class, you need to have the relevant Jasper jar files. Here is the pom.xml file with the relevant dependency jar files. 

<!-- Jasper reports -->
<dependency>
 <groupId>net.sf.jasperreports</groupId>
 <artifactId>jasperreports</artifactId>
 <version>5.0.1</version>
</dependency>
<dependency>
 <groupId>org.codehaus.groovy</groupId>
 <artifactId>groovy-all</artifactId>
 <version>2.1.1</version>
</dependency>
<dependency>
 <groupId>com.lowagie</groupId>
 <artifactId>itext</artifactId>
 <version>2.1.7</version>
</dependency>


Step 12: Finally, the Main.java class, which generates the report by combining the layout template person-template.jrxml with the data. Run the Main class to produce the report.

package com.mycompany.app.jasper;

import java.io.IOException;
import java.io.InputStream;
import java.util.Collection;
import java.util.LinkedList;
import java.util.List;

import net.sf.jasperreports.engine.JRException;
import net.sf.jasperreports.engine.JasperCompileManager;
import net.sf.jasperreports.engine.JasperFillManager;
import net.sf.jasperreports.engine.JasperPrint;
import net.sf.jasperreports.engine.JasperReport;
import net.sf.jasperreports.engine.data.JRBeanCollectionDataSource;
import net.sf.jasperreports.engine.design.JasperDesign;
import net.sf.jasperreports.engine.xml.JRXmlLoader;
import net.sf.jasperreports.view.JasperViewer;

public class Main
{
    
    public static JasperDesign jasperDesign;
    public static JasperPrint jasperPrint;
    public static JasperReport jasperReport;
    public static String reportTemplateUrl = "person-template.jrxml";
    
    public static void main(String[] args) throws IOException
    {
        try
        {
            InputStream resourceAsStream = Thread.currentThread().getContextClassLoader()
                    .getResourceAsStream(reportTemplateUrl);
            //get report file and then load into jasperDesign
            jasperDesign = JRXmlLoader.load(resourceAsStream);
            //compile the jasperDesign
            jasperReport = JasperCompileManager.compileReport(jasperDesign);
            //fill the ready report with data and parameter
            jasperPrint = JasperFillManager.fillReport(jasperReport, null,
                    new JRBeanCollectionDataSource(
                            findReportData()));
            //view the report using JasperViewer
            JasperViewer.viewReport(jasperPrint);
        }
        catch (JRException e)
        {
            e.printStackTrace();
        }
    }
    
    private static Collection<Person> findReportData()
    {
        //declare a list of object
        List<Person> data = new LinkedList<Person>();
        Person p1 = new Person();
        p1.setFirstName("John");
        p1.setSurname("Smith");
        p1.setAge(Integer.valueOf(5));
        data.add(p1);
        return data;
    } 
}

Finally, the generated report looks like this:



Note: The iText jar is required to generate PDF reports. If you want to generate Excel reports, you need an additional dependency such as Apache POI.
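
For example, the filled report can also be written straight to a PDF file with JasperExportManager instead of (or in addition to) being displayed in JasperViewer. The helper class below is a minimal sketch; the class name and file name are just illustrative.

import net.sf.jasperreports.engine.JRException;
import net.sf.jasperreports.engine.JasperExportManager;
import net.sf.jasperreports.engine.JasperPrint;

public class PdfExporter
{
    //writes a filled report (e.g. the jasperPrint produced in Main above) to a PDF file
    public static void exportToPdf(JasperPrint jasperPrint, String targetFile) throws JRException
    {
        JasperExportManager.exportReportToPdfFile(jasperPrint, targetFile);
    }
}

A call like PdfExporter.exportToPdf(jasperPrint, "person-report.pdf") could be placed right after the JasperFillManager.fillReport(...) step in Main.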


Jun 28, 2013

Advanced Apache Camel Parallel Processing Tutorial

In the previous advanced Apache Camel tutorial we looked at sequential multicasting. In this tutorial, I will discuss multitasking with parallel processing. To multitask, you need to define an executor service first. This can be done via the applicationContext.xml file as shown below.

  
        <bean id="executor" class="java.util.concurrent.Executors"
  factory-method="newFixedThreadPool">
  <constructor-arg index="0" value="16" />
 </bean>


Then, in the adv_file_route.xml file, reference the "executor" along with the other attributes as shown below.

   
        <route id="advFileConversion">
   <from
    uri="file://{{adv.in.dir}}?include={{adv.in.file.pattern}}&delay={{poll.delay}}&initialDelay={{initial.delay}}&move=archive/" />

   <multicast stopOnException="true" parallelProcessing="true" executorServiceRef="executor">
    <to uri="direct:email" />
    <to uri="direct:transfer" />
   </multicast>
  </route>


When using File components to consume files, you can define your own thread pool as shown below. This option allows you to share a thread pool among multiple file consumers.

 
<camel:threadPool id="scheduledExecutorService" 
  poolSize="${scheduledExecutorService.pool.size}" 
  maxPoolSize="${scheduledExecutorService.pool.size.max}" 
  scheduled="true" 
  threadName="scheduledExecutorService" />


 
<route id="advFileConversion">
  <from uri="file://{{adv.in.dir}}?include={{adv.in.file.pattern}}&delay={{poll.delay}}&initialDelay={{initial.delay}}
        &move=archive/&scheduledExecutorService=#scheduledExecutorService" />

  <multicast stopOnException="true" parallelProcessing="true" executorServiceRef="executor">
   <to uri="direct:email" />
   <to uri="direct:transfer" />
  </multicast>
</route>


As shown above, the file consumer uses the thread pool via the endpoint option scheduledExecutorService=#scheduledExecutorService.

You can send messages to a number of Camel components to achieve parallel processing and load balancing, for example:

  1. SEDA for in-JVM load balancing across a thread pool. The SEDA component provides asynchronous SEDA behavior, so that messages are exchanged on a BlockingQueue and consumers are invoked in a separate thread from the producer. Note that queues are only visible within a single CamelContext. If you want to communicate across CamelContext instances (for example, communicating between Web applications), see the VM component.
  2. The VM component provides asynchronous SEDA behavior, but differs from the SEDA component in that VM supports communication across CamelContext instances - so you can use this mechanism to communicate across web applications
  3. JMS or ActiveMQ for distributed load balancing and parallel processing. The JMS component allows messages to be sent to (or consumed from) a JMS Queue or Topic. The implementation of the JMS Component uses Spring's JMS support for declarative transactions, using Spring's JmsTemplate for sending and a MessageListenerContainer for consuming.
  4. The Splitter from the EIP (Enterprise Integration Patterns) patterns allows you to split a message into a number of pieces and process them individually. The Splitter takes an "executorServiceRef" attribute, which refers to a custom thread pool to be used for parallel processing. Note that if you set this option, parallel processing is automatically implied, so you do not have to enable it separately. A Java DSL sketch combining SEDA and the Splitter follows this list.
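
Here is the sketch referred to above: a minimal Camel Java DSL route that hands the work off to a SEDA queue and splits each body across a custom thread pool. The endpoint URIs are illustrative, and the "executor" id refers to the thread pool bean defined in applicationContext.xml earlier.

package com.mycompany.app5;

import org.apache.camel.builder.RouteBuilder;

public class ParallelRoutes extends RouteBuilder
{
    @Override
    public void configure() throws Exception
    {
        //hand each incoming file off to an in-memory SEDA queue
        from("file://c:/temp/adv?move=archive/")
            .to("seda:incoming");

        //five concurrent consumers read from the queue; each body is split
        //line by line and the pieces are processed in parallel on the
        //"executor" thread pool
        from("seda:incoming?concurrentConsumers=5")
            .split(body().tokenize("\n"))
                .parallelProcessing()
                .executorServiceRef("executor")
                .to("direct:transfer")
            .end();
    }
}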

Note: As soon as you send multiple messages to different threads or processes you will end up with an unknown ordering across the entire message stream as each thread is going to process messages concurrently. For many use cases the order of messages is not too important. However for some applications this can be crucial, and you need to preserve the order.


Q. What is SEDA?
A. SEDA stands for Staged Event Driven Architecture. This architecture decomposes a complex, event-driven application into a set of stages connected by queues. The most fundamental aspect of the SEDA architecture is a programming model that supports stage-level backpressure and load management. To simplify the idea, think of SEDA as a series of stages sending messages to each other via queues.




Q. What does the Apache Camel "event" component do?
A. The event component provides access to Spring ApplicationEvent objects. This allows you to publish ApplicationEvent objects to a Spring ApplicationContext or to consume them. You can then use Enterprise Integration Patterns, such as the Message Filter, to process them.
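
As a rough illustration (the log category is arbitrary), a Java DSL route can consume these Spring events via the spring-event component:

package com.mycompany.app5;

import org.apache.camel.builder.RouteBuilder;

public class SpringEventRoute extends RouteBuilder
{
    @Override
    public void configure() throws Exception
    {
        //consume Spring ApplicationEvent objects published to the ApplicationContext
        //and write them to the log; an EIP such as a Message Filter could be added here
        from("spring-event://default")
            .to("log:com.mycompany.app5.events");
    }
}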


Jun 27, 2013

Java unit testing with mock objects using frameworks like Mockito.

This is an extension to my blog post entitled unit testing Spring MVC controllers for web and RESTful services with spring-test-mvc. The purpose of this post is to demonstrate the power of using mock objects via frameworks like Mockito.

This example tests a Spring MVC controller that services RESTful requests; in particular, it illustrates a CSV file download.

 
@Controller
public class MyAppController
{
    private static final Logger logger = LoggerFactory.getLogger(MyAppController.class);

    @Resource(name = "aes_cashForecastService")
    private MyAppService myAppService;

    @RequestMapping(
            value = "/portfolio/{portfoliocd}/detailsPositionFeed.csv",
            method = RequestMethod.GET,
            produces = "text/csv")
    @ResponseBody
    public void getPositionFeedCSV(
            @PathVariable(value = "portfoliocd") String portfolioCode,
            @RequestParam(value = "valuationDate", required = true) @DateTimeFormat(pattern = "yyyyMMdd") Date valuationDate,
            HttpServletResponse response) throws Exception
    {
       
        try
        {
            
            PositionFeedCriteria criteria = new PositionFeedCriteria();
            criteria.setEntityCd(portfolioCode);
            criteria.setValuationDtTm(valuationDate != null ? valuationDate
                    : new Date());
            
            String fileName = "test.csv"
            
   //headers for file download
            response.addHeader("Content-Disposition", "attachment; filename="+ fileName);
            response.setContentType("text/csv");
            //returns results as CSV
            String positionFeedCSV = myAppService.getPositionFeedCSV(criteria);
            response.getWriter().write(positionFeedCSV);
            
        }
        catch (Exception e)
        {
           e.printStackTrace();
        }
        logger.info("Completed producing position feed CSV");
    }
 
  public void setMyAppService(MyAppService myAppService)
    {
        this.myAppService = myAppService;
    }
    
}


Now comes the JUnit test class, which uses the Mockito framework to test the MyAppController.

 
//...

import static org.mockito.Matchers.any;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.times;
import static org.mockito.Mockito.verify;
import static org.mockito.Mockito.when;

import java.io.PrintWriter;

import javax.servlet.http.HttpServletResponse;

import org.junit.Before;
import org.junit.Test;
import org.mockito.Mock;
import org.mockito.Mockito;
import org.mockito.MockitoAnnotations;

public class MyAppControllerTest
{
    private static final String PORTFOLIO_CODE = "abc123";
    private static final java.util.Date VALUATION_DATE = new java.util.Date();

    private MyAppService mockMyAppService;
    private MyAppController controller;

    @Mock
    HttpServletResponse response;

    @Mock
    PrintWriter writer;

    @Before
    public void setup()
    {
        MockitoAnnotations.initMocks(this);
        controller = new MyAppController();
        mockMyAppService = mock(MyAppServiceImpl.class);
        controller.setMyAppService(mockMyAppService);
    }

    //The test case
    @Test
    public void testGetPositionFeedCSV() throws Exception
    {
        String str = "dummyCSV";
        //Set up behavior
        when(mockMyAppService.getPositionFeedCSV(any(PositionFeedCriteria.class))).thenReturn(str);
        when(response.getWriter()).thenReturn(writer);

        //Invoke controller
        controller.getPositionFeedCSV(PORTFOLIO_CODE, VALUATION_DATE, response);

        //Verify behavior
        verify(mockMyAppService, times(1)).getPositionFeedCSV(any(PositionFeedCriteria.class));
        verify(writer, times(1)).write(any(String.class));
    }
}
   

The key point to note here is the ability of mock objects to verify whether a particular method was invoked and, if so, how many times it was invoked. This is demonstrated by the last two verify statements. This is one of the key differences between using a mock object and a stub, and a common Java interview question quizzes the candidate's understanding of that difference.
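
Beyond times(n), Mockito supports other verification modes such as atLeastOnce(), never() and verifyNoMoreInteractions(), all static imports from org.mockito.Mockito. A second test method along these lines, reusing the mocks set up above, might look like this sketch:

    @Test
    public void testGetPositionFeedCSVVerificationModes() throws Exception
    {
        when(mockMyAppService.getPositionFeedCSV(any(PositionFeedCriteria.class))).thenReturn("dummyCSV");
        when(response.getWriter()).thenReturn(writer);

        controller.getPositionFeedCSV(PORTFOLIO_CODE, VALUATION_DATE, response);

        //the service must have been called at least once
        verify(mockMyAppService, atLeastOnce()).getPositionFeedCSV(any(PositionFeedCriteria.class));
        //the controller never closes the writer itself
        verify(writer, never()).close();
        //fail if any other, unverified calls were made on the service mock
        verifyNoMoreInteractions(mockMyAppService);
    }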


Jun 24, 2013

Excel spreadsheet to generate SQL

When you have data in tabular format (e.g. an Excel spreadsheet) and would like to insert it into a database table, you need to write SQL insert statements. Manually writing the SQL for multiple records can be cumbersome. This is where an Excel spreadsheet comes in handy, as demonstrated below: a single SQL-generating formula can be copied down, and the cell references increment row by row.

The Excel concatenation character & is used to achieve this. The $ means fixed (i.e. an absolute reference). $A1 fixes the column to A: when you copy the formula down, the row number increments (2, 3, 4, etc.), but the column remains fixed to A. In the example below
  • $A$1 = first_name
  • $B$1 = surname
  • $C$1 = age

Note: Both column and row are prefixed with $, which means both are fixed.


The Excel formula is

 

="insert into person ("&$A$1&", "&$B$1&", "&$C$1&") values ('"&$A2&"','"&$B2&"',"&$C2&")"

The above Excel expression is easier to understand if broken down as shown below, where the concatenation character & combines static text within quotes with cell references like $A$1.

 
"insert into person ("

&

$A$1

&

", "

&

$B$1

&

", "

$C$1

&

") values ('"

&

$A2

&

"','"

&

$B2

&

"',"

&

$C2

&

")"


The generated SQL statement will be
 

insert into person (first_name, surname, age) values ('Peter','Smith',35)


This formula can be copied down in Excel to generate the SQL for all rows.

 

insert into person (first_name, surname, age) values ('Peter','Smith',35)    
insert into person (first_name, surname, age) values ('John','Smith ',12)    
insert into person (first_name, surname, age) values ('Eddy','Wayne',32)    




You can create other SQL statements using the above technique from a table of data for better productivity.


If you have a date column, use the following formula to convert it to text.


TEXT($D2,"dd/mm/yyyy")




OLAP (i.e. Data Warehousing) Vs. OLTP

Q. What is the difference between OLTP and OLAP?
A. OLTP stands for On-Line Transaction Processing and OLAP stands for On-Line Analytical Processing. OLAP uses a multidimensional or relational data store designed to provide quick access to pre-summarized data and multidimensional analysis.

  • MOLAP: Multidimensional OLAP – enabling OLAP by providing cubes.
  • ROLAP: Relational OLAP – enabling OLAP using a relational database management system



OLTP vs. OLAP:
  • Source data: OLTP holds operational data, which is the source of truth. OLAP data comes from the various OLTP data sources, as shown in the above diagram.
  • Usage: OLTP uses transactional, normalized data for daily operational business activities. OLAP uses historical, de-normalized, aggregated multidimensional data for analysis and decision making (i.e. for business intelligence).
  • Data loading: OLTP data is inserted via short inserts and updates, normally captured from user actions in web based applications. OLAP data is refreshed by periodic (i.e. scheduled), long-running (i.e. off-peak) batch jobs, also known as the ETL process, as shown in the diagram.
  • Database design: OLTP involves highly normalized tables. OLAP involves de-normalized tables for speed, and requires more indexes for the aggregated data.
  • Backup: OLTP requires regular backups to prevent loss of data, monetary loss, and legal liability. OLAP data can be reloaded from the OLTP systems if required, so stringent backups are not required.
  • Retention and volume: OLTP transactional data older than a certain period can be archived and purged based on compliance requirements. OLAP data volumes are higher due to the requirement to maintain historical data.
  • Typical users: OLTP is used by operational staff. OLAP is used by management and executives to make business decisions.
  • Space: OLTP space requirements are relatively small if historical data is archived. OLAP space requirements are larger due to aggregation structures and historical data, and more indexes are required than for OLTP.

There are a number of commercial and open-source OLAP (aka Business Intelligence) tools, such as:
  • Oracle Enterprise BI Server, Oracle Hyperion System
  • Microsoft BI & OLAP tools
  • IBM Cognos Series 10
  • SAS Enterprise BI Server
  • JasperSoft (open source)

The OLAP tools are well known for their drill-down and slice-and-dice functionality. They also enable users to analyze data very quickly by nesting the information in tabular or graphical formats. They generally provide good performance due to their highly indexed file structures (i.e. cubes) or in-memory technology.


Q. What is an OLAP cube?
A. An OLAP cube will connect to a data source to read and process the raw data to perform aggregations and calculations for its associated measures. Cubes are the core components of OLAP systems. They aggregate facts from every level in a dimension provided in a schema. For example, they could take data about products, units sold and sales value, then add them up by month, by store, by month and store and all other possible combinations. They’re called cubes because the end data structure resembles a cube.



Jun 21, 2013

JPA interview questions and answers and high level overview - part 2

This is a continuation of JPA and Hibernate interview Questions and answers part 1.

Q. What is an EntityManager?
A. The entity manager javax.persistence.EntityManager provides the operations from and to the database, e.g. finding objects, persisting them, removing objects from the database, etc. Entities which are managed by an EntityManager automatically propagate changes to the database (provided this happens within a transaction commit). These objects are known as persistent objects. If the EntityManager is closed (via close()), the managed entities are left in a detached state; these are known as detached objects. If you want to synchronize them with the database again, the EntityManager provides the merge() method. Once merged, the object(s) become persistent objects again.

The EntityManager is the API of the persistence context, and an EntityManager can be injected directly into a DAO without requiring a JpaTemplate. The Spring container is capable of acting as a JPA container and of injecting the EntityManager by honoring the @PersistenceContext annotation (both as a field-level and as a method-level annotation).

Here is a sample PersonDaoImpl using the Person entity

 
package com.mycompany.myapp.impl;

//...

import com.mycompany.myapp.model.Person;

import java.util.ArrayList;
import java.util.List;

import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;
import javax.persistence.Query;

import org.springframework.stereotype.Repository;

@Repository("personDao")
public class PersonDaoImpl implements PersonDao
{
    @PersistenceContext
    private EntityManager em;
    
    @Override    
    public List<Person> fetchPersonByFirstname(String fName)
    {
        Query query = em.createQuery("from Person p where p.firstName = :firstName", Person.class);
        query.setParameter("firstName", fName);
        
        List<Person> results = query.getResultList();
        
        return results;
    }
    
    public Person find(Integer id)
    {
        return em.find(Person.class, id);
    }  
}

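For completeness, here is a rough sketch of how a service layer might use this DAO. The class and bean names are illustrative, and it assumes declarative transaction management (e.g. <tx:annotation-driven/> or @EnableTransactionManagement) is configured.

package com.mycompany.myapp.impl;

import java.util.List;

import javax.annotation.Resource;

import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

import com.mycompany.myapp.model.Person;

@Service("personService")
public class PersonServiceImpl
{
    @Resource(name = "personDao")
    private PersonDao personDao;

    //read-only transaction; the managed entities returned here become detached
    //once the transaction completes and the persistence context is closed
    @Transactional(readOnly = true)
    public List<Person> findByFirstName(String firstName)
    {
        return personDao.fetchPersonByFirstname(firstName);
    }
}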

Q. What is an Entity?
A. A class which should be persisted in a database must be annotated with javax.persistence.Entity. Such a class is called an entity. An instance of the class will be a row in the person table, and the columns of the person table will be mapped to the fields of the Person Java object annotated with @Entity. Here is the sample Person class.




 
package com.mycompany.myapp.model;

//....

@SuppressWarnings("serial")
@Entity
@Table(name = "person", schema = "dbo", catalog = "my_db_schema")
@Where(clause = "inactiveFlag = 'N'")
@TypeDefs(
{
    @TypeDef(name = "IdColumn", typeClass = IdColumnType.class)})
public class Person implements java.io.Serializable
{

    private int id;
    private String firstName;
    private String surname;
    private byte[] timestamp;
    private char inactiveFlag;

    public Person()
    {
    }
 
 //.. other constructors
 
 @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    @Column(name = "person_id", nullable = false, precision = 9, scale = 0)
    @Type(type = "int")
    public int getId()
    {
        return this.id;
    }
    
    public void setId(int id)
    {
        this.id = id;
    }
 
 @Version
    @Column(name = "timestamp", insertable = false, updatable = false, unique = true, nullable = false)
    @Generated(GenerationTime.ALWAYS)
    public byte[] getTimestamp()
    {
        return this.timestamp;
    }
    
    public void setTimestamp(byte[] timestamp)
    {
        this.timestamp = timestamp;
    }
 
 @Column(name = "InactiveFlag", nullable = false, length = 1)
    public char getInactiveFlag()
    {
        return this.inactiveFlag;
    }
    
    public void setInactiveFlag(char inactiveFlag)
    {
        this.inactiveFlag = inactiveFlag;
    }
    
 
 @Column(name = "first_name", nullable = false, length = 30)
    public String getFirstName()
    {
        return this.firstName;
    }
 
    public void setFirstName(String fName)
    {
        this.firstName = fName;
    }
 
 @Column(name = "surname", nullable = false, length = 30)
    public String getSurname()
    {
        return this.surname;
    }
 
    public void setSurname(String sName)
    {
        this.surname = sName;
    }
 
 //..........
} 




Q. What are the dependency jars required for a JPA application?
A. Include the relevant jar files via your Maven pom.xml file (the versions are typically managed in a parent POM or a dependencyManagement section, which is why they are omitted below).

 
<!-- JPA and hibernate -->
<dependency>
 <groupId>org.hibernate</groupId>
 <artifactId>hibernate-core</artifactId>
</dependency>
<dependency>
 <groupId>org.hibernate</groupId>
 <artifactId>hibernate-entitymanager</artifactId>
</dependency>
<dependency>
 <groupId>org.hibernate.javax.persistence</groupId>
 <artifactId>hibernate-jpa-2.0-api</artifactId>
</dependency>

<!-- validator -->
<dependency>
 <groupId>javax.validation</groupId>
 <artifactId>validation-api</artifactId>
</dependency>

<dependency>
 <groupId>org.hibernate</groupId>
 <artifactId>hibernate-validator</artifactId>
</dependency>





Jun 20, 2013

Notepad++ with power of regex as a productivity tool for developers

This is an extension to my post entitled Notepad++ plugin for viewing JSON data and other benefits, to demonstrate the power of Notepad++ and regular expressions.

This post is based on an industrial example where you have a CSV file and want to add quotes (i.e. ") around each field.

Convert text like:

 
BA555,04-May-2013,04-04-2013,12358,AUSTRALIAN BONDS,,86776.00,AUD,86776,Application,,Capital


To

"BA555","04-May-2013","04-04-2013","12358","AUSTRALIAN BONDS","","86776.00","AUD","86776","Application","","Capital"

Another reason you might want to do this is to construct a Java String array or list as shown below.

       List<String> asList = Arrays.asList(new String[]
        {
            "BA555", "04-May-2013", "04-04-2013", "12358", "AUSTRALIAN BONDS", "", "86776.00", "AUD", "86776",
            "Application", "", "Capital"
        });

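If you prefer to do the quoting programmatically rather than constructing the array by hand, a simple helper method (a sketch, not tied to any framework) could look like this:

    //wraps every field of a comma-separated line in double quotes
    public static String quoteFields(String csvLine)
    {
        String[] fields = csvLine.split(",", -1); // -1 keeps trailing empty fields
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < fields.length; i++)
        {
            if (i > 0)
            {
                sb.append(',');
            }
            sb.append('"').append(fields[i]).append('"');
        }
        return sb.toString();
    }
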
Now let's see how we can use the Notepad++ find/replace function with regular expressions to add quotes around each entry, as shown below.


As you can see

The "Find what" regular expression is ([^,]*)(,?), which means zero or more characters other than "," captured as the first group (referenced as \1), followed by an optional "," captured as the second group (\2).

The "Replace with" expression is "\1"\2, which wraps the captured value (e.g. BA555) in double quotes and then appends the optional "," captured as \2.

Similar approaches can be used for formatting other bulk records like removing spaces, adding quotes, replacing new line characters with tabs, replacing tabs with new lines, replacing commas with new lines, etc.

Here is an example for replacing "," with new lines.



The output will be

 
BA555
04-May-2013
04-04-2013
12358
AUSTRALIAN BONDS

86776.00
AUD
86776
Application

Capital


Jun 19, 2013

JPA interview questions and answers and high level overview - part 1

Q. What is a JPA? What are its key components?
A. The process of mapping Java objects to database tables and vice versa is called "object-relational mapping" (ORM). The Java Persistence API (JPA) provides Java developers with an object/relational mapping facility for managing relational data in Java applications. JPA is a specification, and several implementations are available, such as Hibernate, EclipseLink (TopLink), and Apache OpenJPA. Via JPA the developer can map, store, update and retrieve data from relational databases to Java objects and vice versa.


Q. What is the difference between hibernate.cfg.xml and persistence.xml?
A. If you are using Hibernate's proprietary API, you'll need the hibernate.cfg.xml. If you are using JPA, i.e. the Hibernate EntityManager, you'll need the persistence.xml. You will not need both, as you will be using either the Hibernate proprietary API or JPA. However, if you had used the Hibernate proprietary API with hibernate.cfg.xml and hbm.xml mapping files, and now want to start using JPA, you can reuse the existing configuration files by referencing the hibernate.cfg.xml from the persistence.xml via the hibernate.ejb.cfgfile property, and reuse the existing hbm.xml files. In the long run, migrate the hbm.xml files to JPA annotations.


Q. What is an EntityManagerFactory and a Persistence unit?
A. The EntityManager is created by the EntityManagerFactory, which is configured by the persistence unit. The persistence unit is described via the file persistence.xml in the META-INF directory of the source folder. It defines a set of entities which are logically connected, as well as the connection properties, as shown below via an example.

 
<?xml version="1.0" encoding="UTF-8"?>
<persistence version="1.0"
 xmlns="http://java.sun.com/xml/ns/persistence" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
 xsi:schemaLocation="http://java.sun.com/xml/ns/persistence http://java.sun.com/xml/ns/persistence/persistence_1_0.xsd">

 <persistence-unit name="myapp-server" transaction-type="RESOURCE_LOCAL">

  <provider>org.hibernate.ejb.HibernatePersistence</provider>
  
        <!-- import other mapping files if any -->
        <mapping-file>META-INF/myApp2.xml</mapping-file>

  <class>com.mycompany.myapp.Person</class>
  
  <exclude-unlisted-classes>true</exclude-unlisted-classes>

  <properties>
   <property name="javax.persistence.jdbc.driver" value="org.apache.derby.jdbc.EmbeddedDriver" />
            <property name="javax.persistence.jdbc.url" value="jdbc:derby:/home/myapp/databases/simpleDb;create=true" />
            <property name="javax.persistence.jdbc.user" value="test" />
            <property name="javax.persistence.jdbc.password" value="test" />
    
   <!-- force Hibernate to use a singleton of Ehcache CacheManager -->
   <property name="hibernate.cache.region.factory_class"
    value="org.hibernate.cache.ehcache.EhCacheRegionFactory" />
   <property name="hibernate.cache.use_second_level_cache" value="true" />
         <property name="hibernate.cache.use_query_cache" value="true" />
   <property name="net.sf.ehcache.configurationResourceName" value="ehcache.xml" />
  </properties>

 </persistence-unit>
</persistence>

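Outside a Spring or Java EE container, the persistence unit name is what you pass to Persistence.createEntityManagerFactory. Here is a minimal bootstrap sketch using the unit defined above:

import javax.persistence.EntityManager;
import javax.persistence.EntityManagerFactory;
import javax.persistence.Persistence;

public class JpaBootstrap
{
    public static void main(String[] args)
    {
        //the name must match the persistence-unit name in persistence.xml
        EntityManagerFactory emf = Persistence.createEntityManagerFactory("myapp-server");
        EntityManager em = emf.createEntityManager();

        em.getTransaction().begin();
        //... persist / find / merge / remove entities here ...
        em.getTransaction().commit();

        em.close();
        emf.close();
    }
}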

Usually, JPA defines a persistence unit through the META-INF/persistence.xml file. Starting with Spring 3.1, this XML file is no longer necessary – the LocalContainerEntityManagerFactoryBean now supports a ‘packagesToScan’ property where the packages to scan for @Entity classes can be specified. The snippet below shows how you can bootstrap with or without persistence.xml.

 
package com.mycompany.app.config;

import java.util.HashMap;
import java.util.Map;

import javax.annotation.Resource;
import javax.persistence.EntityManagerFactory;
import javax.sql.DataSource;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.PropertySource;
import org.springframework.dao.annotation.PersistenceExceptionTranslationPostProcessor;
import org.springframework.orm.hibernate4.HibernateExceptionTranslator;
import org.springframework.orm.jpa.JpaTransactionManager;
import org.springframework.orm.jpa.JpaVendorAdapter;
import org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean;
import org.springframework.orm.jpa.vendor.HibernateJpaVendorAdapter;
import org.springframework.transaction.PlatformTransactionManager;
import org.springframework.transaction.annotation.EnableTransactionManagement;


@Configuration
@PropertySource(value =
{"classpath:/common/jpa.properties"
})
@EnableTransactionManagement
public class JpaConfig
{
    @Value("${my_app_common_jpa_showSql:false}")
    private Boolean showSql;
    
    @Value("${my_app_common_jpa_hibernateDialect}")
    private String hibernateDialect;
    
    @Value("${my_app_common_jpa_generateStatistics:false}")
    private Boolean generateStatistics;
    
    @Value("${my_app_common_jpa_generateDdl:false}")
    private Boolean generateDdl;
    
    @Value("${my_app_common_jpa_databasePlatform}")
    private String databasePlatform;
    
    @Resource(name = "myAppDataSource")
    private DataSource dataSource;
    
    private static Logger LOGGER = LoggerFactory.getLogger(JpaConfig.class);
    
    @Bean
    public Map<String, Object> jpaProperties()
    {
        LOGGER.debug("jpaProperties hibernateDialect=" + hibernateDialect);
        
        Map<String, Object> props = new HashMap<String, Object>();
        props.put("hibernate.dialect", hibernateDialect);
        props.put("hibernate.generate_statistics", generateStatistics);
        
        return props;
    }
    
    @Bean
    public JpaVendorAdapter jpaVendorAdapter()
    {
        LOGGER.debug("jpaProperties databasePlatform=" + databasePlatform);
        
        HibernateJpaVendorAdapter hibernateJpaVendorAdapter = new HibernateJpaVendorAdapter();
        hibernateJpaVendorAdapter.setShowSql(showSql);
        hibernateJpaVendorAdapter.setGenerateDdl(generateDdl);
        hibernateJpaVendorAdapter.setDatabasePlatform(databasePlatform);
        
        return hibernateJpaVendorAdapter;
    }
    
    @Bean
    public PlatformTransactionManager transactionManager()
    {
        return new JpaTransactionManager(entityManagerFactory());
    }
    
    @Bean
    public EntityManagerFactory entityManagerFactory()
    {
        LocalContainerEntityManagerFactoryBean lef = new LocalContainerEntityManagerFactoryBean();
        
        lef.setDataSource(dataSource);
        lef.setJpaPropertyMap(this.jpaProperties());
        lef.setJpaVendorAdapter(this.jpaVendorAdapter());
        lef.setPersistenceXmlLocation("META-INF/persistence.xml");
        //alternative without persistence.xml from Spring 3.1
        //lef.setPackagesToScan("com.mycompany.myapp");
        lef.afterPropertiesSet();
        
        return lef.getObject();
    }
    
    @Bean
    public PersistenceExceptionTranslationPostProcessor exceptionTranslation()
    {
        return new PersistenceExceptionTranslationPostProcessor();
    }
    
    @Bean
    public HibernateExceptionTranslator hibernateExceptionTranslator()
    {
        return new HibernateExceptionTranslator();
    }
}

The jpa.properties can be defined as shown below.

 
# properties for JPA
my_app_common_jpa_showSql=false
my_app_common_jpa_generateDdl=false

my_app_common_jpa_databasePlatform=SYBASE
my_app_common_jpa_hibernateDialect=org.hibernate.dialect.SybaseASE15Dialect
my_app_common_jpa_generateStatistics=true
my_app_common_aesJndiName=java:comp/env/jdbc/my_db

The configuration shown above is Java-based config; if you want to use a Spring XML file based config instead, it will look like this:

 
<bean id="myEmf"
 class="org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean">
   <property name="dataSource" ref="dataSource" />
   <property name="packagesToScan" value="org.rest" />
   <property name="jpaVendorAdapter">
      <bean class="org.springframework.orm.jpa.vendor.HibernateJpaVendorAdapter">
         <property name="showSql" value="${my_app_common_jpa_showSql}" />
         <property name="generateDdl" value="${my_app_common_jpa_generateDdl}" />
         <property name="databasePlatform" value="${my_app_common_jpa_hibernateDialect}" />
      </bean>
   </property>
</bean>
 
<bean id="dataSource"
 class="org.springframework.jdbc.datasource.DriverManagerDataSource">
   <property name="driverClassName" value="${driverClassName}" />
   <property name="url" value="${url}" />
   <property name="username" value="${user}" />
   <property name="password" value="${password}" />
</bean>
 
<bean id="txManager" class="org.springframework.orm.jpa.JpaTransactionManager">
   <property name="entityManagerFactory" ref="myEmf" />
</bean>
<tx:annotation-driven transaction-manager="txManager" />


Jun 18, 2013

Advanced Apache Camel tutorial

This is an extension to the last two Apache Camel tutorials.
This advanced tutorial extends part 2. The first route (consumer) polls for CSV files; once a file is found in the c:/temp/adv folder, a sequential route is created by placing the exchange on two in-memory queues via the "direct" component. Firstly, direct:email consumes the exchange data and generates an email with an empty body to notify the user that a file has arrived. The second consumer, direct:transfer, receives the exchange and transforms the document to another CSV using the PersonMapper class; depending on whether the conversion succeeded, the file is written to either the converted or the rejected sub folder. The key thing to note is how body and header information are passed back and forth between the XML-based routing in Spring and the Java beans.

Step 1: The adv_file_converter.properties contains the name/value pairs

adv.in.dir=c:/temp/adv
adv.in.file.pattern=.*\.csv
adv.out.dir=c:/temp/adv
adv.converted.filename=Accepted_File_adv
adv.rejected.filename=Rejected_File_adv
adv.email.to=peter.smith@mycompany.com
adv.email.from=john.smith@mycompany.com

common.mail.host=mailhost.mycompany.net
common.mail.port=25

poll.delay=60000
initial.delay=10000

Step 2: Add the camel-mail component to send email via SMTP endpoints. Make sure that you have the following in the pom.xml.

     
    <dependency>
   <groupId>org.apache.camel</groupId>
   <artifactId>camel-core</artifactId>
   <version>2.10.4</version>
  </dependency>
  <dependency>
   <groupId>org.apache.camel</groupId>
   <artifactId>camel-spring</artifactId>
   <version>2.10.4</version>
  </dependency>
  <dependency>
   <groupId>org.apache.camel</groupId>
   <artifactId>camel-csv</artifactId>
   <version>2.10.4</version>
  </dependency>
  <dependency>
   <groupId>org.apache.camel</groupId>
   <artifactId>camel-mail</artifactId>
   <version>2.10.4</version>
  </dependency>

Step 3: The adv_file_route.xml file with the relevant routes defined.



 
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
 xmlns:p="http://www.springframework.org/schema/p" xmlns:c="http://www.springframework.org/schema/c"
 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:camel="http://camel.apache.org/schema/spring"
 xsi:schemaLocation="
  http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
        http://camel.apache.org/schema/spring http://camel.apache.org/schema/spring/camel-spring.xsd">

 <routeContext id="advFileRoute" xmlns="http://camel.apache.org/schema/spring">
  <route id="advFileConversion">
   <from
    uri="file://{{adv.in.dir}}?include={{adv.in.file.pattern}}&delay={{poll.delay}}&initialDelay={{initial.delay}}&move=archive/" />

   <multicast stopOnException="false">
    <to uri="direct:email" /> 
    <to uri="direct:transfer" />
   </multicast>
  </route>

  <route id="inMemory">
   <from uri="direct:email" />
   <camel:setHeader headerName="subject">
    <camel:simple>${file:name} has been received</camel:simple>
   </camel:setHeader>
      <!-- reset body to null otherwise file content will be sent via email body-->
   <camel:setBody>
    <constant></constant>
   </camel:setBody> 
   <to
    uri="smtp://{{common.mail.host}}?contentType=text/html&to={{adv.email.to}}&from={{adv.email.from}}" />
  </route>
  <route>
   <from uri="direct:transfer" />
   <unmarshal>
    <csv skipFirstLine="true" />
   </unmarshal>
   <to uri="bean:personMapper" />
   <marshal>
    <csv delimiter="," />
   </marshal>
   <convertBodyTo type="java.lang.String" />

   <choice>
    <when>
     <simple>${headers.status} == 'accepted'</simple>
     <to uri="file://{{adv.out.dir}}/converted?fileName={{adv.converted.filename}}_${headers.dest_file_suffix}" />
    </when>
    <when>
     <simple>${headers.status} == 'rejected'</simple>
     <to uri="file://{{adv.out.dir}}/rejected?fileName={{adv.rejected.filename}}_${headers.dest_file_suffix}" />
    </when>
   </choice>

  </route>
 </routeContext>

</beans>

Step 4: The PersonMapper class that transforms the input file.

 
package com.mycompany.app5;

import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

import org.apache.camel.Exchange;
import org.springframework.stereotype.Component;

@Component("personMapper")
public class PersonMapper
{
    
    private static final String HEADER_STATUS = "status";
    private static final String HEADER_DEST_FILE_SUFFIX = "dest_file_suffix";
    
    enum Status
    {
        accepted, rejected
    };
    
    public Object service(List<List<String>> data, Exchange exchange)
    {
        List<Map<String, String>> results = new ArrayList<Map<String, String>>();
        
        Map<String, String> headerLine = new LinkedHashMap<String, String>();
        headerLine.put("surname", "surname");
        headerLine.put("firstname", "firstname");
        headerLine.put("age", "age");
        
        results.add(headerLine);
        
        boolean accepted = true;
        
        for (List<String> line : data)
        {
            Map<String, String> resultLine = new LinkedHashMap<String, String>();
            resultLine.put("surname", line.get(1));
            resultLine.put("firstname", line.get(0));
            resultLine.put("age", line.get(2));
            results.add(resultLine);
            
            if (line.get(1) == null || line.get(2) == null)
            {
                accepted = false;
            }
            
        }
        
        if (accepted)
        {
            exchange.getIn().setHeader(HEADER_STATUS, Status.accepted.name());
        }
        else
        {
            exchange.getIn().setHeader(HEADER_STATUS, Status.rejected.name());
        }
        
        String srcFileName = (String) exchange.getIn().getHeader("CamelFileNameOnly");
        exchange.getIn().setHeader(HEADER_DEST_FILE_SUFFIX, srcFileName);
        
        return results;
        
    }
}

Step 5: The main class that kicks off the standalone route.

 
package com.mycompany.app5;

import org.apache.camel.spring.Main;

public class StandAloneCamelWithSpring
{
    
    private Main main;
    
    public static void main(String[] args) throws Exception
    {
        StandAloneCamelWithSpring example = new StandAloneCamelWithSpring();
        example.boot();
    }
    
    private void boot() throws Exception
    {
        // create a Main instance
        main = new Main();
        // enable hangup support so you can press ctrl + c to terminate the JVM
        main.enableHangupSupport();
        main.setApplicationContextUri("applicationContext.xml");
        
        // run until you terminate the JVM
        System.out.println("Starting Camel. Use ctrl + c to terminate the JVM.\n");
        main.run();
        
    }
    
}

Step 6: The other relevant file "applicationContext.xml"

   
<beans xmlns="http://www.springframework.org/schema/beans"
 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:p="http://www.springframework.org/schema/p"
 xmlns:context="http://www.springframework.org/schema/context"
 xmlns:task="http://www.springframework.org/schema/task"
 xsi:schemaLocation="
    http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-3.0.xsd
    http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context-3.0.xsd
    http://www.springframework.org/schema/task http://www.springframework.org/schema/task/spring-task-3.0.xsd">

 <context:component-scan base-package="com.mycompany.app5" />

 <context:annotation-config />
 <context:spring-configured />

 <task:annotation-driven />

 <bean id="bridgePropertyPlaceholder"
  class="org.apache.camel.spring.spi.BridgePropertyPlaceholderConfigurer">
  <property name="locations">
   <list>
    <value>classpath:*.properties</value>
   </list>
  </property>
 </bean>


 <import resource="classpath*:route.xml" />
</beans>

Step 7: Finally, the route.xml file.

 
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
 xmlns:p="http://www.springframework.org/schema/p" xmlns:c="http://www.springframework.org/schema/c"
 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:util="http://www.springframework.org/schema/util"
 xmlns:camel="http://camel.apache.org/schema/spring"
 xsi:schemaLocation="
  http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
  http://www.springframework.org/schema/util http://www.springframework.org/schema/util/spring-util.xsd
        http://camel.apache.org/schema/spring http://camel.apache.org/schema/spring/camel-spring.xsd">

 <import resource="classpath*:adv_file_route.xml" />

 <camel:errorHandler id="defaultErrorHandler" type="DefaultErrorHandler" />

 <camelContext id="camelContext" xmlns="http://camel.apache.org/schema/spring"
  errorHandlerRef="defaultErrorHandler">
  <routeContextRef ref="advFileRoute" />
 </camelContext>

</beans>

The route definition file "adv_file_route.xml" is covered in Step 3.


Jun 17, 2013

JUnit with Mockito tutorial

Unit testing is very important in the development life cycle, and you can expect interview questions regarding mock objects and unit testing in general. This post extends the blog post on writing a mapper class for Apache Camel by writing a JUnit test with Mockito for the PersonMapper class.


Step 1: Have the relevant dependencies required to write unit tests.

 
<dependency>
 <groupId>junit</groupId>
 <artifactId>junit</artifactId>
 <version>4.10</version>
 <scope>test</scope>
</dependency>
<dependency>
 <groupId>org.mockito</groupId>
 <artifactId>mockito-core</artifactId>
 <version>1.8.5</version>
 <scope>test</scope>
</dependency>


Step 2: The class that is being unit tested. The PersonMapper.java



 
package com.mycompany.app5;

import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

import org.apache.camel.Exchange;
import org.springframework.stereotype.Component;

@Component("personMapper")
public class PersonMapper
{
    
    private static final String HEADER_STATUS = "status";
    private static final String HEADER_DEST_FILE_SUFFIX = "dest_file_suffix";
    
    enum Status
    {
        accepted, rejected
    };
    
    public Object service(List<List<String>> data, Exchange exchange)
    {
        List<Map<String, String>> results = new ArrayList<Map<String, String>>();
        
        Map<String, String> headerLine = new LinkedHashMap<String, String>();
        headerLine.put("surname", "surname");
        headerLine.put("firstname", "firstname");
        headerLine.put("age", "age");
        
        results.add(headerLine);
        
        boolean accepted = true;
        
        for (List<String> line : data)
        {
            Map<String, String> resultLine = new LinkedHashMap<String, String>();
            resultLine.put("surname", line.get(1));
            resultLine.put("firstname", line.get(0));
            resultLine.put("age", line.get(2));
            results.add(resultLine);
            
            if (line.get(1) == null || line.get(2) == null)
            {
                accepted = false;
            }
            
        }
        
        if (accepted)
        {
            exchange.getIn().setHeader(HEADER_STATUS, Status.accepted.name());
        }
        else
        {
            exchange.getIn().setHeader(HEADER_STATUS, Status.rejected.name());
        }
        
        String srcFileName = (String) exchange.getIn().getHeader("CamelFileNameOnly");
        exchange.getIn().setHeader(HEADER_DEST_FILE_SUFFIX, srcFileName);
        
        return results;
        
    }
}


Step 3: The unit test itself with mock objects. The Exchange and Message objects from Apache Camel are mocked using the Mockito framework. Here is the PersonMapperTest class.

 
package com.mycompany.app5;

import static org.mockito.Mockito.doNothing;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

import java.util.ArrayList;
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.List;

import junit.framework.Assert;

import org.apache.camel.Exchange;
import org.apache.camel.Message;
import org.junit.Before;
import org.junit.Test;
import org.mockito.Mock;
import org.mockito.Mockito;

public class PersonMapperTest
{
    
    @Mock
    Exchange mockExchange;
    
    @Mock
    Message mockMessage;
    
    PersonMapper personMapper;
    
    @Before
    public void setup()
    {
        mockExchange = mock(Exchange.class);
        mockMessage = mock(Message.class);
        personMapper = new PersonMapper();
        
        when(mockExchange.getIn()).thenReturn(mockMessage);
        //invoking a void method
        doNothing().when(mockMessage).setHeader(Mockito.anyString(), Mockito.anyString());
        //mock return values
        when(mockMessage.getHeader("CamelFileNameOnly")).thenReturn("capitalexpenses_20130510.csv");
        
    }
    
    @Test
    public void testService()
    {
        @SuppressWarnings("unchecked")
        List<LinkedHashMap<String, String>> result = (List<LinkedHashMap<String, String>>) personMapper.service(
                getPersonRecords(), mockExchange);
        
        Assert.assertNotNull(result);
        Assert.assertEquals(2, result.size());
        LinkedHashMap<String, String> headerRec = result.get(0);
        LinkedHashMap<String, String> record = result.get(1);
        String headerStr = headerRec.values().toString();
        String recordStr = record.values().toString();
        
        Assert.assertEquals(
                "[surname, firstname, age]",
                headerStr
                );
        
        Assert.assertEquals(
                "[Smith, John, 35]",
                recordStr
                );
    }
    
    private List<List<String>> getPersonRecords()
    {
        List<List<String>> values = new ArrayList<List<String>>(5);
        
        List<String> asList = Arrays.asList(new String[]
        {
            "John", "Smith", "35"
        });
        values.add(asList);
        return values;
    }
}


Jun 14, 2013

The power of regular expression and testing tools like regexpal.com

Regular expressions, also known as regexes, are very powerful and widely used in JavaScript, Unix scripting, development tools, monitoring tools like Tivoli, and programming languages. So, it really pays to have a good knowledge of them. This post is about a handy tool named regexpal.com that helps you not only learn regex, but also quickly verify your regexes while working on a project. If you google for "regex online validator" you will find other similar online validating tools.



Step 1: Open up a browser and type regexpal.com in the address bar. You will get the online validator.

Step 2: Type or copy/paste the regex at the top window and the text to apply the regex on the second window.


As indicated in the footer, use RegexBuddy for more powerful regex testing. The regex metacharacters are highlighted in blue, and the matched phrases are also highlighted: in blue for the exact match and in yellow for any other string match.

The above example matches any string followed by one of "Monitoring", "MYAPP", or "CASHFORECAST". The "|" in regex means "OR". What if you want to literally match "Monitoring|MYAPP|CASHFORECAST"? You can escape "|" with "\" as shown below.


Now, the  "|" is not highlighted in blue as it is escaped.

Change the text as shown below. The "CASHFORECAST" is changed to "TRADEFORECAST", and the match is no longer found, as the text is not highlighted.
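
The same behavior can be verified in Java with java.util.regex.Pattern. This is a small sketch; the sample strings are made up:

import java.util.regex.Pattern;

public class AlternationExample
{
    public static void main(String[] args)
    {
        //'|' means OR: matches any text containing Monitoring, MYAPP or CASHFORECAST
        Pattern alternation = Pattern.compile(".*(Monitoring|MYAPP|CASHFORECAST).*");
        //escaping '|' makes it a literal pipe character
        Pattern literal = Pattern.compile(".*Monitoring\\|MYAPP\\|CASHFORECAST.*");

        System.out.println(alternation.matcher("app1 MYAPP alert").matches());                  // true
        System.out.println(literal.matcher("app1 MYAPP alert").matches());                      // false
        System.out.println(literal.matcher("Monitoring|MYAPP|CASHFORECAST is down").matches()); // true
    }
}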


Jun 13, 2013

Apache Camel and Spring with properties file to move files

This is an extension to the camel tutorials


This post highlights the use of a properties file to configure the polling folder, destination folder, file pattern to look for, done-file name, etc.

Step 1: The applicationContext.xml file is shown below; please note that "org.apache.camel.spring.spi.BridgePropertyPlaceholderConfigurer" is used to configure the properties file.

  <beans xmlns="http://www.springframework.org/schema/beans"
 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:p="http://www.springframework.org/schema/p"
 xmlns:context="http://www.springframework.org/schema/context"
 xmlns:task="http://www.springframework.org/schema/task"
 xsi:schemaLocation="
    http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-3.0.xsd
    http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context-3.0.xsd
    http://www.springframework.org/schema/task http://www.springframework.org/schema/task/spring-task-3.0.xsd">

 <context:component-scan base-package="com.mycompany.app3" />

 <context:annotation-config />
 <context:spring-configured />

 <task:annotation-driven />

 <bean id="bridgePropertyPlaceholder"
  class="org.apache.camel.spring.spi.BridgePropertyPlaceholderConfigurer">
  <property name="locations">
   <list>
    <value>classpath:*.properties</value>
   </list>
  </property>
 </bean>

 <import resource="classpath*:route.xml" />
 
 </beans>





Step 2: The properties file myapp.properties will look something like this:

myapp.in.dir=C:/temp/simple/input
myapp.in.file.pattern=AK(TAX|PROPERTY|TRADE)\.CSV
myapp.out.dir=C:/temp/simple/output
myapp.out.file.pattern=GBST_${file:name.noext}_${date:now:yyMMdd}_${date:now:HHmm}.${file:name.ext}

Step 3: The route context file myapp.xml (imported by route.xml in Step 4). Note that properties from myapp.properties are referenced with "{{ }}" double curly brackets. The Apache Camel file language expressions are denoted with ${ }, as shown in the properties file above, to extract the file name, the file extension, the current date, etc.

<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
 xmlns:p="http://www.springframework.org/schema/p"
 xmlns:c="http://www.springframework.org/schema/c"
 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" 
 xmlns:camel="http://camel.apache.org/schema/spring"
 xsi:schemaLocation="
  http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
        http://camel.apache.org/schema/spring http://camel.apache.org/schema/spring/camel-spring.xsd">
    
 <routeContext id="myapp" xmlns="http://camel.apache.org/schema/spring">
  <route id="myappFilePolling" >
   <from uri="file:{{myapp.in.dir}}?include={{myapp.in.file.pattern}}&delay=5000&initialDelay=5000&readLock=rename&move=archive&moveFailed=error&doneFileName=${file:name}.END" />
   <to uri="file:{{myapp.out.dir}}?fileName={{myapp.out.file.pattern}}" />
  </route>
 </routeContext>

</beans>


Step 4: Finally, the route.xml.

<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
 xmlns:p="http://www.springframework.org/schema/p" xmlns:c="http://www.springframework.org/schema/c"
 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:util="http://www.springframework.org/schema/util"
 xmlns:camel="http://camel.apache.org/schema/spring"
 xsi:schemaLocation="
  http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
  http://www.springframework.org/schema/util http://www.springframework.org/schema/util/spring-util.xsd
        http://camel.apache.org/schema/spring http://camel.apache.org/schema/spring/camel-spring.xsd">

 <import resource="classpath*:myapp.xml" />

 <camel:errorHandler id="defaultErrorHandler" type="DefaultErrorHandler" />

 <camelContext id="camelContext" xmlns="http://camel.apache.org/schema/spring"
  errorHandlerRef="defaultErrorHandler">

  <routeContextRef ref="myapp" />

  <dataFormats>
   <zip id="myZip" />
  </dataFormats>
 </camelContext>

</beans>
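To try the route outside a container, a simple stand-alone launcher can be used, along the lines of the earlier Camel with Spring tutorials. This is only a minimal sketch; the class name and the assumption that applicationContext.xml sits on the classpath root are illustrative.

package com.mycompany.app3;

import org.springframework.context.support.ClassPathXmlApplicationContext;

public class FileMoveMain
{
    public static void main(String[] args) throws Exception
    {
        // loading the Spring context starts the CamelContext defined in route.xml,
        // which begins polling myapp.in.dir for matching files
        ClassPathXmlApplicationContext context =
                new ClassPathXmlApplicationContext("applicationContext.xml");
        context.start();

        // keep the JVM alive so the file polling route keeps running
        Thread.sleep(60 * 1000);

        context.close();
    }
}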

Jun 12, 2013

Scenario based question -- Designing a report or feed generation in Java



Scenario based and open-ended questions like this can reveal a lot about your ability to design systems.


Q. If you have a requirement to generate a report or a feed file with millions of records pulled from the database, how will you go about designing it, and what questions will you ask?
A. The questions to ask are:

  • How should the report be delivered? For example, online -- synchronously, where the user expects to see the report on the GUI -- or offline -- asynchronously, where the report is generated in a separate thread and the feed/report is sent via email or another notification mechanism like SFTP.
  • Should we restrict online reports to only the last 12 months of data to minimize the report size and get better performance, and provide reports/feeds for data older than 12 months via offline processing?
  • Should we generate both online and offline reports asynchronously, and for the online reports have the browser or GUI client poll for report completion and then display the results on the GUI? Alternatively, the report can be emailed or downloaded via the web at a later time.
  • Which report generation framework to use -- Jasper Reports, OpenCSV, XSL-FO with Apache FOP, etc. -- depending on the required output formats?
  • What is the source of truth for the report data -- database, RESTful web service call, XML, etc?
  • How to handle exceptional scenarios -- send an error email, use a monitoring system like Tivoli or Nagios to raise production support tickets, etc?
  • Security requirements. Are we sending a feed/report with sensitive data via email? Do we need proper access control to restrict who can generate what for online reports?
  • Should we schedule the offline reports to run during off-peak hours?
  • Archival and purging of older reports. What is the report retention period for auditing and compliance requirements? How big are the feed files, and should they be gzipped?
 The above scenario can be implemented in a number of different ways.

Firstly, using a simple custom solution.

In this solution, a blocking queue and Java multi-threading (i.e. the Executor framework) can be used to asynchronously produce a report, as sketched below. Alternatively, you can use asynchronous processing with Spring.
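Here is a minimal sketch of that custom approach. The class name, the String report id, and the generateReport placeholder are assumptions for illustration; real code would add error handling, status tracking, and notification.

import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.LinkedBlockingQueue;

public class AsyncReportService
{
    // callers enqueue report requests and return immediately;
    // a small worker pool drains the queue and produces the feed off-line
    private final BlockingQueue<String> requests = new LinkedBlockingQueue<String>();
    private final ExecutorService workers = Executors.newFixedThreadPool(2);

    public void start()
    {
        workers.submit(new Runnable()
        {
            public void run()
            {
                try
                {
                    while (!Thread.currentThread().isInterrupted())
                    {
                        String reportId = requests.take();  // blocks until a request arrives
                        generateReport(reportId);           // pull records and write the feed file
                    }
                }
                catch (InterruptedException e)
                {
                    Thread.currentThread().interrupt();
                }
            }
        });
    }

    public void submitRequest(String reportId)
    {
        requests.offer(reportId);  // asynchronous -- the caller does not wait for the report
    }

    private void generateReport(String reportId)
    {
        // placeholder: query the database in batches and stream the records to a CSV file
    }

    public void stop()
    {
        workers.shutdownNow();
    }
}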


Secondly, using an Enterprise Integration Framework.

An Enterprise Integration Framework like Apache Camel can be used to create an asynchronous route, as sketched below. This framework was written to address the Enterprise Integration Patterns (i.e. EIP).
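A minimal sketch of such an asynchronous route in the Camel Java DSL follows. The "seda:reportRequests" endpoint, the reportService bean, and the output directory are illustrative assumptions, not part of the original post.

import org.apache.camel.builder.RouteBuilder;

public class ReportRoute extends RouteBuilder
{
    @Override
    public void configure() throws Exception
    {
        // requests arrive on an in-memory seda queue, a bean produces the report
        // content, and the result is written to a folder (this could equally be
        // an SFTP or mail endpoint)
        from("seda:reportRequests")
            .routeId("reportGeneration")
            .to("bean:reportService?method=generate")
            .to("file:/temp/reports?fileName=report_${date:now:yyyyMMdd}.csv");
    }
}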




Apache Camel is awesome if you want to integrate several applications that use different protocols and technologies. The Spring Integration framework is another alternative. There are a number of Apache Camel tutorials on this blog to get you started, as it is a very handy skill to have for solving business problems and convincing your potential employers.


Finally, using an Enterprise Service Bus (ESB) like webMethods, TIBCO, Oracle Service Bus, Mule, etc. Mule is an open source ESB. There are pros and cons to each approach.


Scenario based questions are very popular with good interviewers, and it really pays to brush up on them. There are a number of other scenario based questions and answers.


Jun 10, 2013

Testing RESTful web services with the cURL command line tool



Modern web application development is full of RESTful web services, and it is very handy to know a few tools for testing them.



Q. What is the cURL command line tool?
A. RESTful web applications are widely used and developed in many languages including Java, and cURL is a command line tool for quickly testing RESTful web service functionality. cURL is typically run from a Unix command line. Here are some examples.

The following example is a HEAD request to a health check URL to see if the RESTful web service is up. "-X" is used to define the HTTP method or verb like HEAD, GET, POST, PUT, etc. "-i" is used to show the response headers, and "-H" is used to pass request headers with the request.

curl  -i -H Content-Type:application/json -X HEAD "http://localhost:8080/my-server/myapp/healthcheck"

Response:

HTTP/1.1 200 OK
Server: Apache-Coyote/1.1
Content-Length: 0
Date: Wed, 27 Mar 2013 23:54:38 GMT

Here is an example of a GET request:

curl -i -X GET -H Accept:application/json 'http://localhost:8080/my-server/myapp/person?first_name=john&last_name=smith'

Response:

HTTP/1.1 200 OK
Server: Apache-Coyote/1.1
Cache-Control: no-cache
Content-Type: application/json
Transfer-Encoding: chunked
Date: Thu, 21 Mar 2013 01:39:22 GMT

{"id":100,"first-name":"John", "last-name:Smith", "title":"Mr."}


Note: The resource URI needs to be quoted if you pass in multiple query parameters separated by '&'. If you have spaces in the query values, you should encode them, i.e. use either the '+' symbol or %20 instead of the space. Also, for GET requests, the -X GET is optional.
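If you are building such query strings from Java instead of typing them on the command line, java.net.URLEncoder applies the same encoding. The sample value below is made up for illustration.

import java.io.UnsupportedEncodingException;
import java.net.URLEncoder;

public class QueryParamEncoding
{
    public static void main(String[] args) throws UnsupportedEncodingException
    {
        // URLEncoder uses the HTML form encoding rules, so a space becomes '+'
        String encoded = URLEncoder.encode("john smith", "UTF-8");
        System.out.println(encoded);                       // prints john+smith

        // '+' can be swapped for %20 if the server expects percent-encoding
        System.out.println(encoded.replace("+", "%20"));   // prints john%20smith
    }
}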


Here is an example of a POST request. "-d" is used for the data to be posted:

curl -i -H "Accept: application/json" -X POST -d "firstName=james" http://localhost:8080/my-server/myapp/persons/person

A PUT request is made very much like a POST request, but with -X PUT. A POST request is used to create a new person record, and a PUT request is used to edit an existing person record, as shown below.

curl -i -H "Accept: application/json" -X PUT -d "firstName=john" http://localhost:8080/my-server/myapp/persons/person/100


Here is an example of a DELETE request:

curl -i -H "Accept: application/json" -X DELETE http://localhost:8080/my-server/myapp/persons/person/100


There are other good tools for testing RESTful web services, like the Poster Firefox add-on. Another great tool that I have blogged about is the RESTClient stand-alone jar. These two are great GUI tools if you do not want to get down and dirty with cURL. If you are testing from Windows, you could install Cygwin and then install and use cURL.


Unix interview Questions and answers: reading from a file

Unix for software developers series:

1. Top 17 Unix commands Java developers use frequently
2. History commands
3. Shell scripting
4. Reading from a file
5. Purging older files
6. Splitting and archiving files
7. Emulator, SSH client
8. Unix commands for developers
9. Unix basic interview Q&A

Q. If you have a shell script file or data file created in DOS, how will you convert it to Unix (especially, end of file and end of line characters)?
A. Using the dos2unix command.

dos2unix ReadWriteFile.sh ReadWriteFile.sh


Q. How do you remove the Control-M characters from a file?
A. Using the sed command that replaces Control-M with nothing

sed 's/^M//g' ReadWriteFile.sh > ReadWriteFileNew.sh

Note: The ^M is typed on the command line with ctrl+v and ctrl+M

Q. How will you go about reading a CSV file as shown below?

Portfolio,Account,PositionIndicator,Amount
AA123,CURR-AUD,CREDIT,2244.14000000
AA123,CURR-AUD,CREDIT,5.60000000
AA123,CURR-AUD,DEBIT,2249.74000000
AA123,CURR-GBP,CREDIT,0.01000000
AA123,CURR-GBP,DEBIT,0.01000000

A.

#!/bin/bash

# $# means the number of arguments supplied to the command. In this case the file to be read
if [ $# != 1 ]; then
    echo "Usage: $0 input-file"
    exit 1
else
    infile=$1
fi

# assign file descriptor 3 to input file
exec 3< $infile

# read till the end of the file
until [ $done ]
do
    read <&3 myline
 #$? is the exit code, and 0 means success and != 0 means not success. That is no line is read.
    if [ $? != 0 ]; then
        done=1
        continue
    fi

    # process file data line-by-line in here

    echo $myline
done


Q. How will you read and then write to another file?
A.

#!/bin/bash

# $# means number of arguments supplied to the command
if [ $# != 2 ]; then
    # $0 is the script name  $1 is input file and $2 is output file  
    echo "Usage: $0 input-file output-file"
    exit 1
else
    infile=$1
fi

# assign file descriptor 3 to input file
exec 3< $infile

# read till the end of the file
until [ $done ]
do
    read <&3 myline
 #$? is the exit code, and 0 means success and != 0 means not success. That is no line is read.
    if [ $? != 0 ]; then
        done=1
        continue
    fi
   
   # process file data line-by-line
   
    # append to output file. If you run it more than once, it keeps appending.
    echo $myline >> $2 
done

Q. How will you empty or clear the contents of a file?
A.

cat /dev/null > /drives/c/jpm_out/simple-out.csv

Q. How will you read the file and then write to another file with the CSV columns in a different order, as shown below?

Account,Portfolio,PositionIndicator,Amount
CURR-AUD,AA123,CREDIT,2244.14000000
CURR-AUD,AA123,CREDIT,5.60000000
CURR-AUD,AA123,DEBIT,2249.74000000
CURR-GBP,AA123,CREDIT,0.01000000

A. Here is the revised shell script

#!/bin/bash

# $# means number of arguments supplied to the command
if [ $# != 2 ]; then
    # $0 is the script name  $1 is input file and $2 is output file  
    echo "Usage: $0 input-file output-file"
    exit 1
else
    infile=$1
fi

# assign file descriptor 3 to input file
exec 3< $infile

export IFS=","

# read till the end of the file
until [ $done ]
do
    # read 4 columns into a,b,c, and d
    read <&3 a b c d
 #$? is the exit code, and 0 means success and != 0 means not success. That is no line is read.
    if [ $? != 0 ]; then
        done=1
        continue
    fi
   
   # process file data line-by-line
   
    # append to output file. If you run it more than once, it keeps appending.
    echo $b,$a,$c,$d >> $2 
done



Jun 9, 2013

Apache Camel JMX link to introspect routes, end points, components, etc at runtime



This is an extension to the last two Apache Camel tutorials.

The JMX based MBeans provided by Apache Camel are useful for debugging and for negative testing where you need to stop one or more routes. JMX can be accessed via JConsole, which we touched on in earlier blog posts.

Step 1: Start the "StandAloneCamelWithSpring" class that you created in the Apache Camel with Spring tutorial - Part 2. Once this route is running, invoke JConsole from a DOS command prompt.
 

Step 2: Select "Local Process" and the "StandAloneCamelWithSpring" process as highlighted above, and then click on "Connect".

Step 3: Select the "MBeans" tab to see all the MBeans. One of them is org.apache.camel, under which you have components, routes, consumers, context, endpoints, etc. available at runtime.



Step 4: The operations to stop and start routes are performed via the "Operations" option as highlighted below.




JConsole can be used to connect to other remote JVM processes by providing hostname, port, and login credentials.

By default, the JMX instrumentation agent is enabled in Camel, which means that the Camel runtime creates and registers MBean management objects with an MBeanServer instance in the VM. This allows Camel users to instantly obtain insights into how Camel routes perform, down to the individual processor level.
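Besides JConsole, the same MBeans can be queried programmatically with the standard JMX API. The sketch below is illustrative only; it assumes the default "org.apache.camel" domain and the "type=routes" object name key used for route MBeans, and that it runs in the same JVM as the Camel context.

import java.lang.management.ManagementFactory;
import java.util.Set;

import javax.management.MBeanServer;
import javax.management.ObjectName;

public class CamelJmxExplorer
{
    public static void main(String[] args) throws Exception
    {
        MBeanServer mBeanServer = ManagementFactory.getPlatformMBeanServer();

        // Camel registers its route MBeans under the org.apache.camel domain
        // (unless mbeanObjectDomainName has been overridden as shown below)
        ObjectName query = new ObjectName("org.apache.camel:type=routes,*");
        Set<ObjectName> routeMBeans = mBeanServer.queryNames(query, null);

        for (ObjectName route : routeMBeans)
        {
            System.out.println("Found route MBean: " + route);
        }
    }
}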

The domain name of the MBean object can be configured by Java VM system property:

-Dorg.apache.camel.jmx.mbeanObjectDomainName=your.domain.name

Or, by adding a jmxAgent element inside the camelContext element in Spring configuration:

 
<camelContext id="camel" xmlns="http://camel.apache.org/schema/spring">
  <jmxAgent id="agent" mbeanObjectDomainName="your.domain.name"/>
    ...
</camelContext> 



JMX can be disabled

<camelContext id="camel" xmlns="http://camel.apache.org/schema/spring">
  <jmxAgent id="agent" disabled="true"/>
    ...
</camelContext>

or

CamelContext camel = new DefaultCamelContext();
camel.disableJMX();

