Marketers today are more involved and skilled with data than ever. Customer information is gathered, stored, and retrieved from a central database, and its volume is growing exponentially. This data helps marketers deliver more specific, targeted marketing communications and create more personalized experiences for users. But managing a database is no easy job, which is why cloud databases and Database as a Service (DaaS) platforms are becoming more popular in enterprises. Amazon Redshift is a well-known name among them.

What is Amazon Redshift?

Amazon Redshift is a fully managed, petabyte-scale, cloud-based data warehouse product designed for large-scale data set storage and analysis. The current market trend is toward cloud-based database services, and HCL Unica Campaign integrates well with them. In this article, we will see how easy it is to configure HCL Unica Campaign to use Amazon Redshift as a user database through the Amazon ODBC driver.

HCL Unica Campaign with Amazon Redshift Database

What do you need as a prerequisite?

You need the following prerequisites to start configuring Unica with the Amazon Redshift database:

  1. An HCL Unica Campaign installation that needs to be integrated with the Amazon Redshift database.
  2. Amazon Redshift details: cluster name, database name, user ID, and password. (You receive all these details when you purchase a contract for Amazon Redshift.)
  3. In the Unica UI, under Settings->Configuration, node “Affinium|Campaign|partitions|partition1|dataSources”, make sure that the “(PostgreSQLTemplate)” data source template is already added and that you can create a data source from it.
  4. If the HCL Unica suite is installed on a Unix-based OS, install unixODBC 2.3.x on the server on which the Unica Campaign listener is installed (see the example below).
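
As a minimal sketch for prerequisite 4, on RHEL the unixODBC driver manager can usually be installed from the standard repositories (package names and commands vary by distribution; confirm that the installed version meets the 2.3.x requirement):

yum install unixODBC unixODBC-devel
odbcinst -j    # prints the unixODBC version and the locations of odbc.ini/odbcinst.ini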

Supported ODBC Drivers

To integrate Unica Campaign with the Amazon Redshift database, you can use either the PostgreSQL ODBC driver or the Amazon ODBC driver. For better performance and to take advantage of all database-related features, it is recommended to use the Amazon ODBC driver. Ideally, use the most recent ODBC driver available from Amazon.

Older ODBC drivers are not supported by Amazon itself, so make sure you are using a supported Amazon ODBC driver version. Currently, Amazon ODBC driver v1.4.11.1000 is available and can be integrated with Unica Campaign. It needs to be installed and configured on the server on which the Unica Campaign listener is running.

How to install Amazon ODBC Drivers

You can refer to the link below, published by Amazon, to download and install the Amazon ODBC driver.

https://docs.aws.amazon.com/redshift/latest/mgmt/configure-odbc-connection.html

Here is a list of commands which you can use to install an Amazon ODBC driver.

RHEL Operating System

wget https://s3.amazonaws.com/redshift-downloads/drivers/odbc/1.4.11.1000/AmazonRedshiftODBC-64-bit-1.4.11.1000-1.x86_64.rpm
yum --nogpgcheck localinstall AmazonRedshiftODBC-64-bit-1.4.11.1000-1.x86_64.rpm

Suse Linux Operating System

wget https://s3.amazonaws.com/redshift-downloads/drivers/odbc/1.4.11.1000/AmazonRedshiftODBC-64-bit-1.4.11.1000-1.x86_64.rpm
zypper install AmazonRedshiftODBC-64-bit-1.4.11.1000-1.x86_64.rpm

It will be installed under /opt/amazon by default.

Windows Operating System

You can download and install the .msi file from the link below.

https://s3.amazonaws.com/redshift-downloads/drivers/odbc/1.4.11.1000/AmazonRedshiftODBC64-1.4.11.1000.msi

How to configure the odbc.ini file?

On a non-Windows server, you need to create a file named odbc.ini. Below is a sample odbc.ini file.

[AMAZONREDSHIFT]
Driver=/opt/amazon/redshiftodbc/lib/64/libamazonredshiftodbc64.so
Host=unica-redshift-cluster.redshift.amazonaws.com
Port=5439
Database=amazondb
Username=awsuser
Password=Password
locale=en-US
BoolsAsChar=0

Note that you need to change the Host, Port, Database, Username, and Password entries in the above example to the values you received from Amazon.

On Windows, you need to add the System DSN under the ODBC Data Source Administrator (64-bit) by referring to the link below:

https://docs.aws.amazon.com/redshift/latest/mgmt/configure-odbc-connection.html#install-odbc-driver-windows

In the above example, AMAZONREDSHIFT is the DSN name.

Changes required on Unica UI

1. Create a Datasource under “Affinium|Campaign|partitions|partition1|dataSources” using template – PostgreSQLTemplate.

2. For the added data source, set the DSN field to the DSN name you defined in the odbc.ini file (non-Windows) or to the name of the System DSN added in the ODBC Data Source Administrator (64-bit) (Windows). If you have set up odbc.ini as in the example above, the DSN value must be set to AMAZONREDSHIFT.

3. Add the data source credentials under the user asm_admin, or under the user defined in the ASMUserForDBCredentials field.

4. For a non-Windows OS, edit the setenv.sh file under the <Campaign Home>/bin directory and append the path /opt/amazon/redshiftodbc/lib/64 to the LD_LIBRARY_PATH environment variable. Also, make sure that the ODBCINI environment variable is set to the absolute path of the odbc.ini file, as in the sketch below.
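
Here is a minimal sketch of the setenv.sh additions, assuming the driver is installed under the default /opt/amazon/redshiftodbc location and using a hypothetical /opt/odbc/odbc.ini path (adjust both to your environment):

# Make the Amazon Redshift ODBC driver library visible to the Campaign listener
LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/opt/amazon/redshiftodbc/lib/64
export LD_LIBRARY_PATH

# Point the driver manager to the odbc.ini that defines the AMAZONREDSHIFT DSN
ODBCINI=/opt/odbc/odbc.ini    # hypothetical path; use the absolute path of your odbc.ini
export ODBCINI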

To Test the Connectivity

To test connectivity to the Amazon Redshift database, you can use the cxntest and odbctest utilities available under the <Campaign Home>/bin directory.

Using cxntest

1. Go to the <Campaign Home>/bin directory from the command prompt on the server on which the Campaign listener is installed.

2. Run setenv.sh (or setenv.bat on Windows).

3. Run cxntest utility.

4. Provide “libodb4dDD.so” or “libodb4d.so” at the “Connection Library?” prompt.

5. Provide the data source name, user name, and password as per your configuration. If you see the “>” prompt, it indicates that you are connected to the database successfully.

[root@server bin]# ./cxntest
Connection Library? libodb4dDD.so
Registered Data Sources:
    Data Sources
    AMAZONREDSHIFT
Data Source? AMAZONREDSHIFT
User ID? awsuser
Password? Password
>

Using odbctest

1. Go to the <Campaign Home>/bin directory from the command prompt on the server on which the Campaign listener is installed.

2. Run setenv.sh (or setenv.bat on Windows).

3. Run odbctest utility.

4. Provide the server name, user name, and password as per your configuration. If you see the “>” prompt, it indicates that you are connected to the database successfully.

[root@server bin]# ./odbctest
Registered Data Sources:
    AMAZONREDSHIFT (/opt/amazon/redshiftodbc/lib/64/libamazonredshiftodbc64.so)
Server Name? AMAZONREDSHIFT
User ID? awsuser
Password? Password
Detected Data Direct compatibility
Server AMAZONREDSHIFT conforms to LEVEL 2.
Server's cursor commit behavior: PRESERVE
Transactions supported: ALL
Maximum number of concurrent statements: 1
For a list of tables, use PRINT.
>

You can also test the connection by running the command “isql -v AMAZONREDSHIFT” on a non-Windows server. On Windows, you can test connectivity directly from the ODBC Data Source Administrator (64-bit).
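
A successful isql connection typically looks like the following (using the DSN and credentials from the sample odbc.ini above):

[root@server bin]# isql -v AMAZONREDSHIFT awsuser Password
+---------------------------------------+
| Connected!                            |
|                                       |
| sql-statement                         |
| help [tablename]                      |
| quit                                  |
|                                       |
+---------------------------------------+
SQL>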

To configure Amazon Redshift Loader (optional step)

By default, Unica uses BULK INSERT to load large volumes of data. If you want better performance than BULK INSERT, you can use the Amazon Redshift loader, which is implemented through the “COPY” command. To understand how the loader works on the Amazon Redshift database side, refer to Amazon's COPY command documentation.

Configuration related to the loader on the AWS side

You need to contact AWS Support and complete the steps below:

1. Create an S3 bucket on the AWS side.

2. Gather AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY to connect to your AWS database.

3. Install the AWS Cli utility on the server on which Unica Campaign Listener is installed.

4. Make the required configuration to connect to the S3 bucket using the “aws configure” command, as shown below.
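
For step 4, the standard “aws configure” prompts look like the following; the values shown are placeholders (use your own keys and the region of your Redshift cluster):

[root@server ~]# aws configure
AWS Access Key ID [None]: <Your-Access-Key-ID>
AWS Secret Access Key [None]: <Your-Secret-Access-Key>
Default region name [None]: us-east-1
Default output format [None]: json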

Once the above steps are done, you need to test the following directly from the command prompt:

To copy a data file to the S3 bucket (provide the actual data file path for <DATAFILE>):

aws s3 cp <DATAFILE> s3://s3bucketaws/<DATAFILE Name>

To load data into the table, connect to the database from any SQL client and run the command below:

copy <TABLE> from 's3://s3bucketaws/<DATAFILE Name>' credentials 'aws_access_key_id=<Your-Access-Key-ID>;aws_secret_access_key=<Your-Secret-Access-Key>' csv;

Once you can run the above commands successfully directly from the command prompt (not from Unica), you can proceed with the necessary configuration on the Unica side. If you face any issues while testing the above steps, reach out to AWS Support and make sure the testing is successful before proceeding.

Now, create a shell script that accepts two arguments: the data file path and the table name. Ensure that this script runs successfully and can load data into the table when executed directly from the command prompt. Here is a sample loader script:

#!/bin/sh
# Sample loader script: copies the data file to S3, then loads it into the
# target table on Amazon Redshift with the COPY command via isql.
DATAFILE=$1
TABLE_NM=$2
export S3BUCKET=[Change me]                 # e.g. s3://s3bucketaws/
export AWS_ACCESS_KEY_ID=[Change me]
export AWS_SECRET_ACCESS_KEY=[Change me]
export DSNNAME="AMAZONREDSHIFT"             # Change this value as per your odbc.ini
ERR_CD=1
LOG_FILE="/tmp/log.$$"
FILE_NM=`basename $1`
S3_FILE=$S3BUCKET$FILE_NM
echo "file to copy is $1"

# Copy the data file to the S3 bucket
echo aws s3 cp $1 $S3BUCKET >> $LOG_FILE 2>&1
aws s3 cp $1 $S3BUCKET >> $LOG_FILE 2>&1
RESULT=$?
if [ ${RESULT} -ne 0 ]; then
    echo "ERROR in aws s3 cp" >> $LOG_FILE
    exit $ERR_CD
fi

# Load the file from S3 into the target table using the COPY command
COMMAND="COPY $TABLE_NM FROM '$S3_FILE' CREDENTIALS 'aws_access_key_id=$AWS_ACCESS_KEY_ID;aws_secret_access_key=$AWS_SECRET_ACCESS_KEY' csv"
echo $COMMAND > /tmp/sql.$$
isql $DSNNAME < /tmp/sql.$$
RESULT=$?
echo "RESULT is $RESULT"
if [ ${RESULT} -ne 0 ]; then
    echo "ERROR in COPY" >> $LOG_FILE
    exit $ERR_CD
fi

# Remove the data file from S3 after a successful load
aws s3 rm $S3_FILE
echo "LOG_FILE is $LOG_FILE"
exit 0

Configuration on the HCL Unica side

Keep the loader script under the <Campaign Partition Home>/scripts folder. Add the following lines to the data source section of the Unica Marketing Platform configuration that pertains to the specific Amazon Redshift data source, i.e., “Affinium|Campaign|partitions|partition1|dataSources|<REDSHIFT Datasource>”.

loaderCommand = /opt/Unica/campaign/partitions/partition1/scripts/amazonload.sh <DATAFILE> <TABLENAME>
loaderCommandForAppend = /opt/Unica/campaign/partitions/partition1/scripts/amazonload.sh <DATAFILE> <TABLENAME>
loaderDelimiter = ,
loaderDelimiterForAppend = ,
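
Before running a flowchart, it is worth confirming that the Campaign listener OS user can execute the script directly; for example (the data file and table names below are hypothetical):

chmod +x /opt/Unica/campaign/partitions/partition1/scripts/amazonload.sh
/opt/Unica/campaign/partitions/partition1/scripts/amazonload.sh /tmp/sample_data.csv SAMPLE_TABLE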

To enable ODBC Trace

Sometimes it is necessary to enable ODBC trace-level logging to troubleshoot an issue. To enable trace-level logging, update the file “amazon.redshiftodbc.ini”, which can be found under /opt/amazon/redshiftodbc/lib/64, as shown below.

[root@server 64]# cat amazon.redshiftodbc.ini
[Driver]
## - DriverManagerEncoding is detected automatically.
##  Add DriverManagerEncoding entry if there is a need to specify.
ErrorMessagesPath=/opt/Campaign/redshift_odbc_logs
LogLevel=6
LogPath=/opt/Campaign/redshift_odbc_logs
SwapFilePath=/tmp

LogLevel=6 indicates that trace-level logging is enabled. You can specify your desired folder locations for LogPath and ErrorMessagesPath. To disable trace-level logging, set LogLevel to 0. On Windows, you can enable trace-level logging from the ODBC Data Source Administrator (64-bit).

Known Issues

1. Amazon ODBC Driver v1.4.3.1000 does not support BULK INSERT on the Amazon Redshift Database. This is a limitation of this version of the driver.

2. With Amazon ODBC Driver v1.4.11.1000, if you export Campaign-generated fields such as Flowchartname and Cellcode in outbound process boxes such as Snapshot and Extract, the data type of these fields is treated as BOOL and the flowchart execution fails. To fix this issue, add “BoolsAsChar=0” to your odbc.ini file.

3. If you are using the PostgreSQL v9.6.5 driver to connect to the Amazon Redshift database, the count of inserted/updated records displayed on the process box will not match the actual affected count. This is a known issue with PostgreSQL v9.6.5. It is recommended to use v9.03.0100 with Unica to avoid the incorrect count display.

4. The PostgreSQL driver (all versions) does not support BULK INSERT on the Amazon Redshift database. This is a limitation of this driver.

Amazon Redshift is easy to scale and works well with Unica Campaign, providing consistently fast performance even with thousands of concurrent queries running. To learn more about Unica Campaign integration with Amazon Redshift, reach out to us.
