
Using HQL for Data Storage

Blog Post created by prakhar Employee on Mar 1, 2017

Suppose you have an HQL file that contains a Hive CREATE TABLE statement followed by INSERT statements to load data into the table.
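For illustration, such a file might look like the sketch below (the table name sales and its columns are hypothetical):

    -- create the target Hive table
    CREATE TABLE IF NOT EXISTS sales (
      id     INT,
      amount DOUBLE
    );

    -- followed by a huge number of single-row INSERT statements,
    -- which is what makes the file so large
    INSERT INTO TABLE sales VALUES (1, 10.5);
    INSERT INTO TABLE sales VALUES (2, 20.0);
    -- ... millions more rows ...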

In Java, an array can be indexed only by an int, which means the highest index an array can have is Integer.MAX_VALUE. Since Hive buffers the script in memory while parsing it, a file that exceeds this limit causes an out-of-memory error.

Hive cannot execute such large HQL files. The data itself should be provided in a proper data format such as text, CSV, or Parquet.

An HQL file should contain only the DDL queries, not the data; HQL is not a data storage format.
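A better layout, sketched below with a hypothetical table name and file path, keeps the data in a delimited text file and leaves only the DDL and a load statement in the HQL file:

    -- DDL only: define the table over comma-delimited text data
    CREATE TABLE IF NOT EXISTS sales (
      id     INT,
      amount DOUBLE
    )
    ROW FORMAT DELIMITED
    FIELDS TERMINATED BY ','
    STORED AS TEXTFILE;

    -- load the data from a CSV file instead of embedding it as INSERTs
    LOAD DATA LOCAL INPATH '/tmp/sales.csv' INTO TABLE sales;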


As a workaround, you can raise the Hive client heap by setting HADOOP_HEAPSIZE to 4 GB in /opt/mapr/hive/hive-1.2/conf/hive-env.sh, as shown below.
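HADOOP_HEAPSIZE is specified in MB, so 4 GB corresponds to 4096:

    # in /opt/mapr/hive/hive-1.2/conf/hive-env.sh
    export HADOOP_HEAPSIZE=4096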

This workaround might still fail for very large HQL files.

It is not recommended to use HQL for storing data, especially at large volumes.
