
Query Hive table using Spark SQL

Question asked by 1004140 on Apr 21, 2017
Latest reply on Apr 21, 2017 by Cathy

I have a table named orders in Hive with the following data:

 

OID|DATE|CUSTOMER_ID|AMOUNT
A-102|2009-10-08 00:00:00|3|3000
A-100|2009-10-08 00:00:00|3|1500
A-101|2009-11-20 00:00:00|2|1560
A-103|2008-05-20 00:00:00|4|2060

 

I can run select * from orders where OID="A-102" from the Hive shell and it returns the expected result.

 

How can I do the same query using Spark SQL and Scala? Spark SQL doesn't seem to recognize "A-102" as a single string.

 

Any string containing a hyphen (A-102) or a blank space (A 102) is not treated as a single string, and only part of the string ends up being used in the WHERE clause for comparison.
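The kind of statement I want to run from Scala looks like the following. This is only a minimal sketch, assuming a Spark 1.6-style HiveContext pointing at the existing Hive metastore; with Spark 2.x, a SparkSession created with enableHiveSupport() would play the same role, and the application name here is illustrative:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

// Sketch only: connect Spark SQL to the existing Hive metastore.
val sc = new SparkContext(new SparkConf().setAppName("OrdersQuery"))
val hiveContext = new HiveContext(sc)

// Single quotes should keep 'A-102' as one string literal,
// the same way the Hive shell treats it.
val result = hiveContext.sql("SELECT * FROM orders WHERE OID = 'A-102'")
result.show()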

 

Any suggestions are appreciated.
