I am trying to refactor some table reads that were previously written in PySpark so that they use Scala instead.
However, whenever I try to read in my data, I get an error stating "Table or view not found".
My code looks something like this:
import org.apache.spark.sql.{SparkSession, DataFrame}

// Schema (database) and table names to read from
val schema = "schema_name"
val table = "table_name"

// Read the table into a DataFrame
val df: DataFrame = spark.read
  .format("parquet")              // table format
  .option("header", "true")       // add options as needed
  .option("inferSchema", "true")
  // .schema(schema)              // note: .schema() expects a StructType or DDL string, not a database name
  .table(s"$schema.$table")
To troubleshoot whether I needed to switch databases or catalogs, I tried running

// Show all databases (schemas) visible in the current catalog
spark.sql("SHOW DATABASES").show()

which returns only a single database named "default".
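To double-check where the session is actually pointed, my understanding is that these Spark 3.x calls should work (current_catalog() may only be available on newer runtimes, so treat that line as an assumption):

// Print the session's current database (schema)
println(spark.catalog.currentDatabase)
// Print the current catalog, if the runtime supports it
spark.sql("SELECT current_catalog()").show()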
Running the following

// Show all tables in the current database
spark.sql("SHOW TABLES").show()

returns an empty DataFrame with three columns (database, tableName, isTemporary) and no rows.
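My guess is that the session is pointed at the wrong catalog or schema, so this is what I was planning to try next (catalog_name is a placeholder, and USE CATALOG only applies if the workspace uses Unity Catalog):

// List tables in the target schema explicitly
spark.sql(s"SHOW TABLES IN $schema").show()
// Or switch the session to the target catalog/schema first
spark.sql("USE CATALOG catalog_name")  // Unity Catalog only
spark.sql(s"USE $schema")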
How can I go about pointing Scala at the correct schema?