So, with the above SQL dependency I am able to import org.apache.spark.sql, but to import the SQLContext I apparently have to import "org.apache.spark.sql.api.java.sqlContext", except that "sqlContext" is not present in "org.apache.spark.sql.api.java". If I type "." after api.java, autocomplete only offers UDF1 through UDF22. So what are the right Maven dependencies that I am supposed to configure?

I am also getting the following error:

    type SQLContext is not a member of package org.apache.spark

The error is fairly obvious, as the package I am importing for SQLContext does not actually contain SQLContext. Below is the code I have written; it is incomplete because I am stuck on the SQLContext:

```scala
package spark_sqlserver.tblcreation

import java.util.Properties
import java.sql.DriverManager
import java.sql.Connection
import org.apache.spark.SparkContext
//import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
//import org.apache.spark.sql.api.java.

object SQLServerTbleCreate {
  def main(args: Array[String]) {
    val conf = new SparkConf()
      .setAppName("test SQL")
      .setMaster("path to Spark Master Server")
      .set("spark.executor.memory", "1g")
    val sc = new SparkContext(conf)
    val sqlContext = new org.apache.spark.SQLContext(sc)
  }
}
```
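From my reading of the Spark documentation, SQLContext lives in org.apache.spark.sql and is provided by the spark-sql artifact, so I believe a dependency along these lines is what is needed. The Scala suffix and version number below are my assumptions; they would have to match whatever spark-core dependency is already in the pom:

```xml
<!-- spark-sql provides org.apache.spark.sql.SQLContext. -->
<!-- NOTE: the _2.10 suffix and the version are assumptions; they must
     match the Scala version and Spark version of the spark-core dependency. -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.10</artifactId>
    <version>1.6.3</version>
</dependency>
```

With that artifact on the classpath, my understanding is that the import should be `import org.apache.spark.sql.SQLContext` (constructed as `new SQLContext(sc)`), rather than anything under `org.apache.spark.sql.api.java`.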

My Eclipse is configured with "C:\Program Files\Java\jre1.8.0_121\bin\server\jvm.dll".

Please help; I have to complete multiple POCs based on SQLContext. I have gone through multiple articles on Stack Overflow as well as elsewhere, and I can see other people facing the same issue as mine, but none of the suggested fixes is working for me.

Regards,
Amitesh Sahay