Hi,
I am new to Spark and would like to ask a question about a use case I am working on.
The plan is to use Hadoop/Spark as a reporting solution: fetch data from an RDBMS (Oracle) source system, perform ETL, and run report jobs using Spark SQL.
My question is: can Spark also be used to serve interactive report requests, for example a user requesting a report from a web application?
Would the new Spark Structured Streaming be helpful in my case?
Or should I load the ETL output into a structured database for interactive reports?
Please suggest. Thanks in Advance.