
Internal IO exception for Incremental Python Code

I am trying to set up incremental logic for an MV in Python, but I am getting the error "INC_004028001: Internal IO exception".

Below is the code I am using. I would appreciate any help with this error message.

from pyspark.sql.functions import last_day
from pyspark.sql.functions import date_format
from pyspark import *
import pyspark.sql.functions as F

# Read the source tables and register them as temp views for Spark SQL
df_table1 = read("table1")
df_table1.createOrReplaceTempView("table1")

df_table2 = read("table2")
df_table2.createOrReplaceTempView("table2")

# Keep only rows inserted or updated within the last 4 days from either table
df = spark.sql("""
SELECT table1.column1,
       table1.INSERT_DATE,
       table1.UPDATE_DATE
FROM table1 table1
WHERE NVL(table1.UPDATE_DATE, table1.INSERT_DATE) >= DATE_SUB(CURRENT_DATE(), 4)
UNION
SELECT table2.column1,
       CAST(table2.Added AS TIMESTAMP) AS INSERT_DATE,
       NULL AS UPDATE_DATE
FROM table2 table2
WHERE CAST(table2.Added AS TIMESTAMP) >= DATE_SUB(CURRENT_DATE(), 4)
""")

save(df)

1 reply
I see that the read() command does not have the schema name attached to the table name. Alternatively, since this logic is plain SQL, you can create the MV with the SQL language option, paste the SQL there, and add the schema name to the tables in the FROM clause.
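
For example, a minimal sketch of the first suggestion, where "MySchema" is a placeholder for the actual physical schema that owns the tables (not a name from the original post):

# "MySchema" is hypothetical; replace it with the schema that contains table1/table2
df_table1 = read("MySchema.table1")
df_table1.createOrReplaceTempView("table1")

df_table2 = read("MySchema.table2")
df_table2.createOrReplaceTempView("table2")

# The rest of the MV (the spark.sql query and save(df)) stays the same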
