Convert Clob Type to String In Java

An SQL Clob is used to store large amounts of textual data in the database. The java.sql.Clob interface represents the CLOB data type.

In this tutorial I am going to explain how to convert clob to string in Java.

In a typical scenario you read the Clob data from the database, convert it to a String and display it on the console, or perhaps on a UI (User Interface).

Steps to convert Clob to String

  • Read the clob value using getCharacterStream() method.
Reader r = clobData.getCharacterStream();
  • Read each character one by one from the retrieved stream of characters and append it to a StringBuffer.
StringBuffer buffer = new StringBuffer();
int ch;
while ((ch = r.read()) != -1) {
    buffer.append((char) ch);
}
  • Display the obtained String

Example:-
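The original listing is not shown here; the following is a minimal, self-contained sketch. The connection URL, credentials, table name BOOK and CLOB column CONTENT are assumptions, so adjust them to your own schema.

import java.io.Reader;
import java.sql.Clob;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class ClobToStringExample {

    public static void main(String[] args) throws Exception {
        // Hypothetical connection details - replace with your own URL and credentials
        try (Connection con = DriverManager.getConnection("jdbc:oracle:thin:@localhost:1521:xe", "user", "password");
             Statement stmt = con.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT CONTENT FROM BOOK")) {

            while (rs.next()) {
                Clob clobData = rs.getClob("CONTENT");

                // Step 1: read the clob value as a stream of characters
                Reader r = clobData.getCharacterStream();

                // Step 2: append each character to a StringBuffer
                StringBuffer buffer = new StringBuffer();
                int ch;
                while ((ch = r.read()) != -1) {
                    buffer.append((char) ch);
                }
                r.close();

                // Step 3: display the obtained String
                System.out.println(buffer.toString());
            }
        }
    }
}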

Happy Learning 🙂

Object Serialization – Saving Objects in Java

Serialization is the process of converting the state of an object into a byte stream. After the serialized object has been written to a file, it can be read back and deserialized, that is, the bytes that represent the object and its data can be used to recreate the object in memory.

Writing Serialized Object to a file

Step-1
Implement the Serializable interface in your class. Serializable is known as a marker interface because it doesn't have any methods to implement. Its only purpose is to announce that the class implementing it is serializable. If any superclass of a class is serializable, the subclass is automatically serializable even if it doesn't explicitly implement Serializable.

Step-2
Make a FileOutputStream and an ObjectOutputStream

FileOutputStream fileStream = new FileOutputStream("MyGame.ser");

ObjectOutputStream os = new ObjectOutputStream(fileStream);

Step-3
Write the object

os.writeObject(obj1);

os.writeObject(obj2);

Step-4

Close the stream

os.close();

Serialized objects save the values of the object's instance variables, so that an identical instance can be brought back to life when the object is deserialized.

When an object is serialized, the entire object graph is saved to the file, that is, all the instance variables, all the objects those variables refer to, all the objects those objects refer to, and so on. And all of it happens automatically.

Transient Variables and Serialization

If you want an instance variable to be skipped during serialization, mark it transient. During deserialization, transient variables come back set to their default values.

Reading Serialized Object from a file

Deserialization is the serialization process in reverse, restoring an object to life in the same state in which it was saved to the file, except for the transient variables, which come back either as null or as default primitive values.

Step-1

Make a FileInputStream and an ObjectInputStream
FileInputStream fileStream = new FileInputStream("MyGame.ser");

ObjectInputStream os = new ObjectInputStream(fileStream);

Step-2

Read the Objects
Object one = os.readObject();

Object two = os.readObject();

Step-3

Cast the objects to their type
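For example, if the objects written earlier were instances of a (hypothetical) GameCharacter class:

GameCharacter c1 = (GameCharacter) one;

GameCharacter c2 = (GameCharacter) two;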

Step-4

Close the ObjectInputStream

os.close();
Please note that static variables are not serialized. They are not saved with the object, and when an object is deserialized it will have whatever value its class's static variable currently holds.

Serialization Example

Writing Objects to a file
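The original listing is not shown here; below is a minimal sketch, assuming a simple Person class with name and age fields (the class and field names are assumptions) and the Person.ser file mentioned later.

import java.io.FileOutputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

class Person implements Serializable {

    private static final long serialVersionUID = 1L;

    String name;
    int age;

    Person(String name, int age) {
        this.name = name;
        this.age = age;
    }
}

public class WriteObjectExample {

    public static void main(String[] args) throws Exception {
        Person p1 = new Person("John", 30);
        Person p2 = new Person("Jane", 25);

        // Step 2: chain an ObjectOutputStream to a FileOutputStream
        FileOutputStream fileStream = new FileOutputStream("Person.ser");
        ObjectOutputStream os = new ObjectOutputStream(fileStream);

        // Step 3: write the objects
        os.writeObject(p1);
        os.writeObject(p2);

        // Step 4: close the stream
        os.close();
    }
}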

Reading Objects from a file
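Again a minimal sketch, assuming the Person class and the Person.ser file from the writer above:

import java.io.FileInputStream;
import java.io.ObjectInputStream;

public class ReadObjectExample {

    public static void main(String[] args) throws Exception {
        // Step 1: chain an ObjectInputStream to a FileInputStream
        FileInputStream fileStream = new FileInputStream("Person.ser");
        ObjectInputStream is = new ObjectInputStream(fileStream);

        // Step 2: read the objects in the same order they were written
        // Step 3: cast them back to their actual type
        Person one = (Person) is.readObject();
        Person two = (Person) is.readObject();

        // Step 4: close the stream
        is.close();

        System.out.println(one.name + " - " + one.age);
        System.out.println(two.name + " - " + two.age);
    }
}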

Executing the serializable example
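With the hypothetical Person objects used in the sketches above, running WriteObjectExample followed by ReadObjectExample would print something like:

John - 30
Jane - 25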

Please note that the default location of the Person.ser file will be under the src folder.

I hope you guys enjoyed reading this article. Happy Coding everyone 🙂

How to create a Batch Job using Spring Batch

Batch processing means executing a series of jobs, and Spring Batch provides an open source framework for batch processing. It offers classes and APIs for reading/writing resources, job processing, job restart, and partitioning techniques for handling high volumes of data.

Spring Batch Components

Spring Batch consists of the following components:-

Job:- It represents the Spring Batch job. Each job can have one or more steps.

Step:- Each step consists of a task which needs to be done, for instance reading from a csv file.

ItemReader:- Each step typically has one ItemReader. It reads input data (for instance a file) and provides the data sequentially, one item at a time. For example, it reads the data from a csv file and provides the records in the file as a list.

ItemProcessor:- The ItemProcessor modifies/transforms the data, one item at a time. Each step can have one ItemProcessor.

ItemWriter:- The ItemWriter writes the data, one item at a time. For instance, it writes the data read by the ItemReader from a csv file to the database. Each step typically has one ItemWriter.

JobRepository:- It stores the details of the batch job, essentially metadata about configured and executed jobs. It provides CRUD operations for JobLauncher, Job and Step instances.

JobLauncher:- A Job is executed by the JobLauncher.

JobInstance:- Each job may be associated with multiple job instances, and each instance is uniquely identified by its JobParameters.

JobExecution:- Each run of a JobInstance is referred to as a JobExecution. It basically keeps track of the job status, start and end times, etc.

Now that we are familiar with the components of a Spring Batch job, I will show you how to create one.

Our use case is to read from a csv file, process the data in the file and write the data to the database.

Prerequisites

A csv file consisting of first name, last name, department, email and age.

Database Table with the following fields:-

  • Full Name
  • Email
  • Department

Create Table Script:-

CREATE TABLE Student (
    studentId Number(13,0),
    department varchar(250),
    email varchar(250),
    fullName varchar(250)
);
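The batch job maps each csv row to a Student domain object. The original class is not shown here; the following is a minimal sketch whose field names are assumptions derived from the csv columns and the table above (studentId is assumed to be assigned by the database).

public class Student {

    private String firstName;
    private String lastName;
    private String fullName;
    private String department;
    private String email;
    private int age;

    public String getFirstName() { return firstName; }
    public void setFirstName(String firstName) { this.firstName = firstName; }

    public String getLastName() { return lastName; }
    public void setLastName(String lastName) { this.lastName = lastName; }

    public String getFullName() { return fullName; }
    public void setFullName(String fullName) { this.fullName = fullName; }

    public String getDepartment() { return department; }
    public void setDepartment(String department) { this.department = department; }

    public String getEmail() { return email; }
    public void setEmail(String email) { this.email = email; }

    public int getAge() { return age; }
    public void setAge(int age) { this.age = age; }
}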

Project Structure

Maven dependencies

Add the following dependencies in the pom file

Csv File to be read

Creating ItemReader

In the reader we define the columns that are to be read from the csv file. It basically defines the mapping of the csv file to a domain object. In this case the columns read are mapped to the Student object, as shown in the sketch below.
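A minimal sketch of such a reader bean, assuming Spring Batch 4, a students.csv file on the classpath with a header row, and the Student class sketched earlier (the bean name, file name and column order are assumptions). This method belongs inside the batch configuration class shown later.

import org.springframework.batch.item.file.FlatFileItemReader;
import org.springframework.batch.item.file.builder.FlatFileItemReaderBuilder;
import org.springframework.batch.item.file.mapping.BeanWrapperFieldSetMapper;
import org.springframework.context.annotation.Bean;
import org.springframework.core.io.ClassPathResource;

@Bean
public FlatFileItemReader<Student> reader() {
    // Map each delimited line of the csv file to a Student object
    BeanWrapperFieldSetMapper<Student> fieldSetMapper = new BeanWrapperFieldSetMapper<>();
    fieldSetMapper.setTargetType(Student.class);

    return new FlatFileItemReaderBuilder<Student>()
            .name("studentItemReader")
            .resource(new ClassPathResource("students.csv"))
            .linesToSkip(1)                 // skip the header row
            .delimited()
            .names("firstName", "lastName", "department", "email", "age")
            .fieldSetMapper(fieldSetMapper)
            .build();
}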

Creating ItemProcessor

In the ItemProcessor I am concatenating the first name and the last name to create a full name, as shown in the sketch below.

So, the csv file is read by the ItemReader and mapped to the Student object. The processor then modifies or processes the data; it basically contains any business logic needed to transform the data.
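A minimal sketch of the processor, assuming the Student class sketched earlier (the class name StudentItemProcessor is an assumption):

import org.springframework.batch.item.ItemProcessor;

public class StudentItemProcessor implements ItemProcessor<Student, Student> {

    @Override
    public Student process(Student student) throws Exception {
        // Concatenate the first name and the last name to build the full name
        student.setFullName(student.getFirstName() + " " + student.getLastName());
        return student;
    }
}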

Creating ItemWriter
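A minimal sketch of a writer that delegates to a DAO, assuming Spring Batch 4 (the class names StudentItemWriter and StudentDao are assumptions; the DAO is sketched in the next section):

import java.util.List;

import org.springframework.batch.item.ItemWriter;
import org.springframework.beans.factory.annotation.Autowired;

public class StudentItemWriter implements ItemWriter<Student> {

    @Autowired
    private StudentDao studentDao;

    @Override
    public void write(List<? extends Student> students) throws Exception {
        // Persist every processed Student in the current chunk
        for (Student student : students) {
            studentDao.save(student);
        }
    }
}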

Dao File to save the data in the database

The database details are provided in the property files.
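A minimal sketch of the DAO, assuming a JdbcTemplate backed by the DataSource configured in the property file (the class name StudentDao and the insert statement are assumptions; studentId is assumed to be generated by the database):

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.stereotype.Repository;

@Repository
public class StudentDao {

    @Autowired
    private JdbcTemplate jdbcTemplate;

    public void save(Student student) {
        // Insert the processed record into the Student table
        jdbcTemplate.update(
                "INSERT INTO Student (department, email, fullName) VALUES (?, ?, ?)",
                student.getDepartment(), student.getEmail(), student.getFullName());
    }
}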

Creating Batch Job Configuration
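A minimal sketch of the job configuration, assuming Spring Batch 4 with the reader, processor and writer sketched above (the bean, step and job names are assumptions):

import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.launch.support.RunIdIncrementer;
import org.springframework.batch.item.file.FlatFileItemReader;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
@EnableBatchProcessing
public class BatchConfiguration {

    @Autowired
    private JobBuilderFactory jobBuilderFactory;

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    // The reader() bean from the "Creating ItemReader" section also lives in this class

    @Bean
    public StudentItemProcessor processor() {
        return new StudentItemProcessor();
    }

    @Bean
    public StudentItemWriter writer() {
        return new StudentItemWriter();
    }

    @Bean
    public Step step1(FlatFileItemReader<Student> reader) {
        // Read, process and write the students in chunks of 10
        return stepBuilderFactory.get("step1")
                .<Student, Student>chunk(10)
                .reader(reader)
                .processor(processor())
                .writer(writer())
                .build();
    }

    @Bean
    public Job importStudentJob(Step step1) {
        return jobBuilderFactory.get("importStudentJob")
                .incrementer(new RunIdIncrementer())
                .flow(step1)
                .end()
                .build();
    }
}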

Executing the Spring batch job

To execute the Spring Batch job we use the JobLauncher. It takes the job to be executed and the job parameters associated with it. We can schedule the job to run at a specific time interval, based on a cron expression, through the @Scheduled annotation.

In this case the cron entry is read from the property file.
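A minimal sketch of the launcher, assuming @EnableScheduling is active on the application and a batch.job.cron entry in the property file (the property key and class name are assumptions):

import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

@Component
public class BatchJobScheduler {

    @Autowired
    private JobLauncher jobLauncher;

    @Autowired
    private Job importStudentJob;

    // The cron expression is read from the property file
    @Scheduled(cron = "${batch.job.cron}")
    public void runBatchJob() {
        try {
            // A fresh timestamp parameter makes every run a new JobInstance
            JobParameters params = new JobParametersBuilder()
                    .addLong("time", System.currentTimeMillis())
                    .toJobParameters();
            jobLauncher.run(importStudentJob, params);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}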

After the job is executed the csv file is read and entries are made in the database.

I hope you enjoyed reading this article. Happy Coding 🙂

ConcurrentHashMap in Java

ConcurrentHashMap is an alternative to Hashtable and synchronizedMap. It is used in multi-threaded applications and provides better performance than Hashtable and synchronizedMap.

ConcurrentHashMap Implementation In Java

ConcurrentHashMap introduces a feature called the concurrency level. The ConcurrentHashMap is divided into segments based on the concurrency level, and we need to choose this level carefully because it impacts the performance of the map. The default concurrency level is 16, which means 16 threads can access the map simultaneously as long as each thread is operating on a different segment; the map is basically divided into 16 segments and each segment is guarded by a different lock. This mechanism boosts the performance of ConcurrentHashMap in a thread-safe environment. However, bulk operations like putAll() or clear() are not atomic, so concurrent reads may reflect only some of their changes.
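As a minimal sketch, the constructor below sets the segment-related parameters explicitly (the keys and values are only illustrative):

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class ConcurrentHashMapExample {

    public static void main(String[] args) throws InterruptedException {
        // initial capacity 16, load factor 0.75, concurrency level 16 (the defaults)
        Map<String, Integer> scores = new ConcurrentHashMap<>(16, 0.75f, 16);

        // Writer threads updating different keys usually hit different segments,
        // so they can proceed in parallel without blocking each other
        Thread writer1 = new Thread(() -> scores.put("alice", 10));
        Thread writer2 = new Thread(() -> scores.put("bob", 20));
        writer1.start();
        writer2.start();
        writer1.join();
        writer2.join();

        // Reads do not lock the map
        System.out.println(scores.get("alice") + " " + scores.get("bob"));
    }
}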

Features in ConcurrentHashMap

  • It doesn't allow null keys or values.
  • The iterator in ConcurrentHashMap is fail safe, which means it doesn't throw ConcurrentModificationException if the map is updated during iteration.
  • The update operations in ConcurrentHashMap are thread safe, which means only one thread at a time can update a given segment of the ConcurrentHashMap.
  • Any number of threads can perform read operations without locking the map.
  • Since ConcurrentHashMap locks only a portion of the map, a read can overlap with an update operation. In that case the get() method returns the result of the most recently completed update.
  • ConcurrentHashMap is not ordered.

When to use ConcurrentHashMap

It should be used when there are more reader threads than writer threads. If the number of writer threads is equal to or greater than the number of reader threads, the performance of ConcurrentHashMap is similar to that of Hashtable or synchronizedMap.

References:- https://docs.oracle.com/javase/8/docs/api/java/util/concurrent/ConcurrentHashMap.html

I hope you liked this article. Happy Coding 🙂

Fail fast & Fail safe Iterators in Java

Iterators are basically used to traverse over a set of objects one by one. Java supports two types of iterators:-

  • Fail Fast Iterator
  • Fail Safe Iterator

Fail Fast Iterator

Fail fast iterators do not allow any modification to a collection while iterating/traversing it. By modification I mean any update to or addition of elements in the collection. If you try to modify the collection during iteration, ConcurrentModificationException is thrown, as the sketch below shows.
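The original listing is not shown here; a minimal sketch that reproduces the behaviour with a HashMap might look like this:

import java.util.HashMap;
import java.util.Iterator;
import java.util.Map;

public class FailFastExample {

    public static void main(String[] args) {
        Map<String, String> map = new HashMap<>();
        map.put("1", "one");
        map.put("2", "two");

        Iterator<String> iterator = map.keySet().iterator();
        while (iterator.hasNext()) {
            System.out.println(map.get(iterator.next()));
            // Structurally modifying the map while iterating makes the next call
            // to iterator.next() throw ConcurrentModificationException
            map.put("3", "three");
        }
    }
}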

In the above code we are trying to add a new element to the map. The fail fast iterator throws ConcurrentModificationException if a new element is added while iterating the collection.

Similarly, if you try to remove an element from the map (map.remove() method) while iterating the collection, ConcurrentModificationException is thrown.

However, if you remove an element through the iterator itself (iterator.remove() method) while iterating, no exception is thrown.

Fail Safe Iterator

Fail safe iterators allow modifications to a collection while iterating over it. You can update, remove or add elements in the collection while traversing it without any exception being thrown. The reason is that the iteration is performed on a clone/copy of the collection, not on the actual collection, so modifications made to the actual collection go unnoticed by the iterator. This also means that if the collection is modified during traversal you might see stale values during iteration; the iteration is only weakly consistent, as the sketch below illustrates.
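A minimal sketch using CopyOnWriteArrayList, whose iterator works on a snapshot of the list:

import java.util.Iterator;
import java.util.concurrent.CopyOnWriteArrayList;

public class FailSafeExample {

    public static void main(String[] args) {
        CopyOnWriteArrayList<String> list = new CopyOnWriteArrayList<>();
        list.add("one");
        list.add("two");

        Iterator<String> iterator = list.iterator();

        // Modify the list after the iterator has been created
        list.add("three");

        while (iterator.hasNext()) {
            // No ConcurrentModificationException is thrown, but the iterator keeps
            // working on the snapshot taken at creation time, so "three" is not printed
            System.out.println(iterator.next());
        }
    }
}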

I hope you enjoyed reading this article. Happy Coding 🙂