How to Connect a Database with Hazelcast Using Hikari Connection Pooling?

Overview

Hazelcast provides the MapStore, MapLoader, and QueueStore interfaces to store and load data to/from a NoSQL or RDBMS data store. Hazelcast also ships with an internal (shaded) com.zaxxer.hikari dependency, so you can implement Hikari connection pooling using these classes.

Hazelcast Hikari connection pooling

Let’s implement Hazelcast Hikari connection pooling using the H2 database with the Hazelcast community edition. We use a Maven-based project for the demo.

Hazelcast Persistence Project Structure

Here we use IntelliJ IDEA to create a Maven-based project. Below is the project structure for implementing the Hazelcast persistence layer.

Project Structure

Hazelcast Maven Dependency

Add the Hazelcast and H2 database dependencies to pom.xml:

<dependency>
   <groupId>com.hazelcast</groupId>
   <artifactId>hazelcast</artifactId>
   <version>5.4.0</version>
</dependency>

<dependency>
   <groupId>com.h2database</groupId>
   <artifactId>h2</artifactId>
   <version>2.2.224</version>
   <scope>runtime</scope>
</dependency>

Hazelcast Hikari Connection Pooling

To implement Hazelcast Hikari connection pooling, you need to create a data source using com.hazelcast.shaded.com.zaxxer.hikari.HikariDataSource, which is available through Hazelcast's shaded HikariCP dependency.

1. Create a ConnectionPool interface to get a connection from the Hikari connection pool.

import java.sql.Connection;

public interface ConnectionPool {
    Connection getConnection();
}

2. Create a HikariDataSourcePool class that implements ConnectionPool; it will hold the Hazelcast Hikari connection pooling implementation.

public class HikariDataSourcePool implements ConnectionPool {
  public HikariDataSourcePool() {
    //Initialize the connection pooling here
  }

  @Override
  public Connection getConnection() {
    // Implemented in step 4 below
    return null;
  }
}

3. Create references to com.hazelcast.shaded.com.zaxxer.hikari.HikariConfig and com.hazelcast.shaded.com.zaxxer.hikari.HikariDataSource; you can initialize both in the constructor.

private static HikariDataSource hikariDataSource = null;
private static HikariConfig hikariConfig;

public HikariDataSourcePool() {
    if (null != hikariDataSource) {
        log.info("HikariDataSource already created; reusing the existing pool.");
    } else {
        createHikariDataSource();
    }
}

private void createHikariDataSource() {
    hikariConfig = new HikariConfig();
    hikariConfig.setJdbcUrl("jdbc:h2:mem:testdb");
    hikariConfig.setUsername("sa");
    hikariConfig.setPassword("");
    hikariConfig.setMaximumPoolSize(5);
    hikariConfig.setIdleTimeout(30000);
    hikariConfig.setConnectionTimeout(30000);
    hikariConfig.setPoolName("Demo-POOL");
    hikariConfig.setDriverClassName("org.h2.Driver");
    hikariDataSource = new HikariDataSource(hikariConfig);

    log.info("Datasource Created..");
}

In the above code snippet, HikariConfig is initialized with the H2 database configuration. Once the configuration is set, a HikariDataSource instance is created, which builds the pool of connections. The settings are summarized below, followed by a sketch of loading the same settings from a properties object.

Database Connection Configuration

  • JdbcUrl – JDBC connection URL, including host, port, and database name
  • Username – database user name
  • Password – database password
  • DriverClassName – JDBC driver class name (here, the H2 driver)
  • PoolName – connection pool name
  • MaximumPoolSize – maximum number of database connections kept in the pool
  • IdleTimeout – maximum time (in milliseconds) a connection may sit idle in the pool
  • ConnectionTimeout – maximum time (in milliseconds) to wait for a connection from the pool
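
If you prefer not to hard-code these settings, HikariConfig can also be built from a java.util.Properties object (or a property file). The helper class below is only an illustrative sketch; the class name and method are assumptions, not part of the article's project.

import java.util.Properties;

import com.hazelcast.shaded.com.zaxxer.hikari.HikariConfig;
import com.hazelcast.shaded.com.zaxxer.hikari.HikariDataSource;

public class HikariConfigFromProperties {

    // Builds the same pool as above, but from a Properties object instead of setter calls.
    public static HikariDataSource createDataSource() {
        Properties props = new Properties();
        props.setProperty("jdbcUrl", "jdbc:h2:mem:testdb");
        props.setProperty("username", "sa");
        props.setProperty("password", "");
        props.setProperty("maximumPoolSize", "5");
        props.setProperty("idleTimeout", "30000");
        props.setProperty("connectionTimeout", "30000");
        props.setProperty("poolName", "Demo-POOL");
        props.setProperty("driverClassName", "org.h2.Driver");

        HikariConfig config = new HikariConfig(props);
        return new HikariDataSource(config);
    }
}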

4. Override getConnection(); this method is responsible for getting a database connection from the pool.

    @Override
    public Connection getConnection() {
        try {
            if (null != hikariDataSource) {
                log.debug("Getting SQL connection from the Hikari pool.");
                return hikariDataSource.getConnection();
            } else {
                throw new DatabaseSQLException("Oops! Hikari datasource is not available.");
            }
        } catch (SQLException e) {
            throw new DatabaseSQLException("Exception while creating database connection: " + e);
        }
    }

5. The complete HikariDataSourcePool connection pooling class:

import java.sql.Connection;
import java.sql.SQLException;

import com.hazelcast.shaded.com.zaxxer.hikari.HikariConfig;
import com.hazelcast.shaded.com.zaxxer.hikari.HikariDataSource;

import lombok.extern.slf4j.Slf4j;

@Slf4j
public class HikariDataSourcePool implements ConnectionPool {
    private static HikariDataSource hikariDataSource = null;

    private static HikariConfig hikariConfig;

    public HikariDataSourcePool() {
        if (null != hikariDataSource) {
            log.info("HikariDataSource already created; reusing the existing pool.");
        } else {
            createHikariDataSource();
        }
    }

    private void createHikariDataSource() {
        hikariConfig = new HikariConfig();

        hikariConfig.setJdbcUrl("jdbc:h2:mem:testdb");
        hikariConfig.setUsername("sa");
        hikariConfig.setPassword("");
        hikariConfig.setMaximumPoolSize(5);
        hikariConfig.setIdleTimeout(30000);
        hikariConfig.setConnectionTimeout(30000);
        hikariConfig.setPoolName("Demo-POOL");
        hikariConfig.setDriverClassName("org.h2.Driver");

        hikariDataSource = new HikariDataSource(hikariConfig);

        log.info("Datasource Created..");
    }

    @Override
    public Connection getConnection() {
        try {
            if (null != hikariDataSource) {
                log.debug("Getting SQL connection from the Hikari pool.");
                return hikariDataSource.getConnection();
            } else {
                throw new DatabaseSQLException("Oops! Hikari datasource is not available.");
            }
        } catch (SQLException e) {
            throw new DatabaseSQLException("Exception while creating database connection: " + e);
        }
    }
}
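
With the pool class in place, a quick way to sanity-check it is to borrow a connection and run a trivial query against the in-memory H2 database. The class below is only an illustrative sketch (the class name and query are not part of the article's project).

import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.Statement;

public class PoolSmokeTest {

    public static void main(String[] args) throws Exception {
        ConnectionPool pool = new HikariDataSourcePool();

        // Closing the connection in try-with-resources returns it to the Hikari pool.
        try (Connection connection = pool.getConnection();
             Statement statement = connection.createStatement();
             ResultSet resultSet = statement.executeQuery("SELECT 1")) {
            if (resultSet.next()) {
                System.out.println("Pool is working, query returned: " + resultSet.getInt(1));
            }
        }
    }
}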

Integrate Hazelcast Hikari Connection Pooling with MapStore and MapLoader

Once the connection pool is created, you can integrate it with MapStore, MapLoader, and QueueStore. In the demo below, we implement it with MapStore and MapLoader.

The demo uses an EMPLOYEE table in the H2 database; the CRUD operations are implemented in the EmployeeMapStoreFactory MapStore implementation.

1. The Employee model below maps to the EMPLOYEE database table.

import java.io.Serializable;

import lombok.Builder;
import lombok.Getter;
import lombok.ToString;

@Builder
@Getter
@ToString
// Hazelcast must be able to serialize map values, hence Serializable
public class Employee implements Serializable {
    private Integer empId;
    private String firstName;
    private String lastName;
    private String email;
    private String phoneNo;
    private Double salary;
}
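
Because jdbc:h2:mem:testdb is an in-memory database, the EMPLOYEE table has to exist before the MapStore can use it. The snippet below is a sketch of one way to create it; the class name, DDL, and column types are assumptions derived from the queries used later in this article.

import java.sql.Connection;
import java.sql.Statement;

public class EmployeeSchemaInitializer {

    // Creates the EMPLOYEE table used by the MapStore (columns assumed from the SQL below).
    public static void createEmployeeTable(ConnectionPool pool) throws Exception {
        String ddl = "CREATE TABLE IF NOT EXISTS EMPLOYEE ("
                + "EMPID INT PRIMARY KEY, "
                + "FIRSTNAME VARCHAR(100), "
                + "LASTNAME VARCHAR(100), "
                + "EMAIL VARCHAR(255), "
                + "PHONENO VARCHAR(20), "
                + "SALARY DOUBLE)";

        try (Connection connection = pool.getConnection();
             Statement statement = connection.createStatement()) {
            statement.execute(ddl);
        }
    }
}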

2. You can abstract the MapStore implementation into a base class and initialize the connection pool instance there; the required methods are then overridden in subclasses of AbstractDataStoreFactory.

import com.hazelcast.map.MapLoader;
import com.hazelcast.map.MapStore;

public abstract class AbstractDataStoreFactory<K, V>
        implements MapStore<K, V>, MapLoader<K, V> {
}
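
For illustration, here is one possible way to flesh out this base class along the lines described above. This is only a sketch and an assumption, not the article's code, which leaves the body empty and lets the concrete subclass below create its own pool instead.

import com.hazelcast.map.MapLoader;
import com.hazelcast.map.MapStore;

public abstract class AbstractDataStoreFactory<K, V>
        implements MapStore<K, V>, MapLoader<K, V> {

    // Shared Hikari-backed connection pool available to every concrete data store factory.
    protected final ConnectionPool connectionPool = new HikariDataSourcePool();
}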

3. Override and implement the required methods, which store and load data to/from the H2 database when Hazelcast performs map CRUD operations through the MapStore factory.

Each method gets a database connection from the Hikari connection pool, creates a PreparedStatement from it, and performs the corresponding database operation.

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.Collection;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

import lombok.extern.slf4j.Slf4j;

@Slf4j
public class EmployeeMapStoreFactory extends AbstractDataStoreFactory<Integer, Employee> {
    private ConnectionPool pool;

    public EmployeeMapStoreFactory() {
        pool = new HikariDataSourcePool();
    }

    @Override
    public Iterable<Integer> loadAllKeys() {
        String query = "SELECT EMPID FROM EMPLOYEE";

        List<Integer> empIds = new ArrayList<>();
        try (Connection connection = pool.getConnection();
             PreparedStatement preparedStatement = connection.prepareStatement(query);
             ResultSet resultSet = preparedStatement.executeQuery()) {
            while (resultSet.next()) {
                empIds.add(resultSet.getInt(1));
            }
        } catch (SQLException exception) {
            throw new DatabaseSQLException("Error on load all keys : " + exception);
        }

        return empIds;
    }

    @Override
    public Employee load(Integer empId) {
        String query = "SELECT EMPID, FIRSTNAME, LASTNAME, EMAIL, SALARY FROM EMPLOYEE WHERE EMPID=?";
        try (Connection connection = pool.getConnection();
             PreparedStatement preparedStatement = connection.prepareStatement(query)) {
            preparedStatement.setInt(1, empId);
            ResultSet resultSet = preparedStatement.executeQuery();

            if (resultSet.next()) {
                return Employee.builder()
                        .empId(resultSet.getInt(1))
                        .firstName(resultSet.getString(2))
                        .lastName(resultSet.getString(3))
                        .email(resultSet.getString(4))
                        .salary(resultSet.getDouble(5))
                        .build();
            }
        } catch (SQLException exception) {
            throw new DatabaseSQLException("Error on load key : " + exception);
        }

        // MapLoader contract: return null when the key does not exist in the database
        return null;
    }

    @Override
    public Map<Integer, Employee> loadAll(Collection<Integer> collection) {
        log.info("Load all employees..");

        // Load each requested key and collect the results into a key -> Employee map
        return collection.stream().collect(Collectors.toMap(id -> id, this::load));
    }

    @Override
    public void store(Integer integer, Employee employee) {
        String storeQuery = "INSERT INTO EMPLOYEE(EMPID, FIRSTNAME, LASTNAME, EMAIL, SALARY) VALUES(?, ?, ?, ?, ?)";
        try (Connection connection = pool.getConnection();
             PreparedStatement preparedStatement = connection.prepareStatement(storeQuery)) {
            preparedStatement.setInt(1, employee.getEmpId());
            preparedStatement.setString(2, employee.getFirstName());
            preparedStatement.setString(3, employee.getLastName());
            preparedStatement.setString(4, employee.getEmail());
            preparedStatement.setDouble(5, employee.getSalary());

            preparedStatement.executeUpdate();

        } catch (Exception exception) {
            log.error("Exception while storing employee", exception);
            throw new DatabaseSQLException(exception.getMessage());
        }
        }
    }

    @Override
    public void storeAll(Map<Integer, Employee> map) {
        // Alternative: map.forEach(this::store); or use a single batch insert as below.

        String storeQuery = "INSERT INTO EMPLOYEE(EMPID, FIRSTNAME, LASTNAME, EMAIL, SALARY) VALUES(?, ?, ?, ?, ?)";
        try (Connection connection = pool.getConnection();
             PreparedStatement preparedStatement = connection.prepareStatement(storeQuery)) {
            map.forEach((identity, employee) -> {
                try {
                    preparedStatement.setInt(1, employee.getEmpId());
                    preparedStatement.setString(2, employee.getFirstName());
                    preparedStatement.setString(3, employee.getLastName());
                    preparedStatement.setString(4, employee.getEmail());
                    preparedStatement.setDouble(5, employee.getSalary());
                    // Queue this row for the batch insert
                    preparedStatement.addBatch();
                } catch (SQLException e) {
                    log.error("Error adding employee to batch", e);
                }
            });

            // Execute all queued inserts in one round trip
            int[] batchResults = preparedStatement.executeBatch();
            log.info("Stored {} employees in batch", batchResults.length);
        } catch (SQLException exception) {
            log.error("Exception : {}", exception.getMessage());
            throw new DatabaseSQLException(exception.getMessage());
        }
    }

    @Override
    public void delete(Integer integer) {
        // Delete is not implemented in this demo
    }

    @Override
    public void deleteAll(Collection<Integer> collection) {
        // Bulk delete is not implemented in this demo
    }
}

Note – Build the above EmployeeMapStoreFactory MapStore factory class (as a JAR) and copy it into the Hazelcast bin/user-lib folder.

Hazelcast Configuration

Configure the EmployeeMapStoreFactory MapStore factory class in hazelcast.xml with the map configuration below.

<hazelcast>

    ...

    <map name="employee_map">
        <map-store enabled="true">
            <class-name>com.javatecharc.demo.maps.EmployeeMapStoreFactory</class-name>
        </map-store>
    </map>

    ...

</hazelcast>
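
If you embed Hazelcast in your own application instead of deploying the class to bin/user-lib, the same mapping can be configured programmatically. The sketch below is illustrative only (class name, sample data, and the embedded-member setup are assumptions, not the article's code); it presumes EmployeeMapStoreFactory is on the classpath and the EMPLOYEE table exists. put() writes through to H2 via store(), and get() falls back to load() on a cache miss.

import com.hazelcast.config.Config;
import com.hazelcast.config.MapStoreConfig;
import com.hazelcast.core.Hazelcast;
import com.hazelcast.core.HazelcastInstance;
import com.hazelcast.map.IMap;

public class EmployeeMapStoreDemo {

    public static void main(String[] args) {
        // Programmatic equivalent of the <map-store> XML configuration above
        Config config = new Config();
        MapStoreConfig mapStoreConfig = new MapStoreConfig()
                .setEnabled(true)
                .setClassName("com.javatecharc.demo.maps.EmployeeMapStoreFactory");
        config.getMapConfig("employee_map").setMapStoreConfig(mapStoreConfig);

        HazelcastInstance hazelcast = Hazelcast.newHazelcastInstance(config);
        IMap<Integer, Employee> employees = hazelcast.getMap("employee_map");

        Employee employee = Employee.builder()
                .empId(1)
                .firstName("John")
                .lastName("Doe")
                .email("john.doe@example.com")
                .salary(50000.0)
                .build();

        employees.put(1, employee);            // write-through: store() inserts the row into H2
        employees.evict(1);                    // drop the in-memory copy
        System.out.println(employees.get(1));  // cache miss: load() reads the row back from H2

        hazelcast.shutdown();
    }
}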

Conclusion

Integrating HikariCP with Hazelcast for database connectivity can greatly enhance the performance and scalability of your distributed applications. By efficiently managing database connections, you ensure that your system remains responsive even under heavy load. Following the steps outlined above will help you set up this integration smoothly, leading to a more resilient and high-performing application. The example persistence code above is available on GitHub.
