JDBC to Hive external table as Avro in Spark

My requirement is to use the Databricks spark-avro package, read an Oracle table over JDBC, and write the data as Avro into a Hive external table. I am doing the following steps:

Step 1: include the Databricks Avro package in spark-shell
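For reference, this is how I launch the shell (the package coordinates are an assumption on my part, for Spark 2.x with Scala 2.11; adjust for your versions):

```shell
# pull the Databricks spark-avro package from Maven at startup
spark-shell --packages com.databricks:spark-avro_2.11:4.0.0
```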

Step 2: connect to the Oracle table over JDBC
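The JDBC read itself works; it looks roughly like this (the host, table name, and credentials below are placeholders, not my real values):

```scala
// Placeholder connection details -- replace with your own
val jdbcUrl = "jdbc:oracle:thin:@//dbhost:1521/ORCLPDB"

val df = spark.read
  .format("jdbc")
  .option("url", jdbcUrl)
  .option("dbtable", "MYSCHEMA.MY_TABLE")
  .option("user", "scott")
  .option("password", "tiger")
  .option("driver", "oracle.jdbc.OracleDriver")
  .load()

df.printSchema()  // confirms the Oracle columns are mapped to Spark SQL types
```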

I am stuck here.

The question is how to derive the Avro schema from the DataFrame and save it to a location that can be referenced in TBLPROPERTIES when creating the external Hive table. Also, should the data be written out as Avro data files, or is it enough to register the DataFrame as a temp table (registerTempTable)?
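For context, this is the rough approach I was considering, but I am not sure it is correct. `SchemaConverters` comes from the spark-avro package; the record name, namespace, and all paths below are made up for illustration:

```scala
import org.apache.avro.SchemaBuilder
import com.databricks.spark.avro.SchemaConverters

// Derive an Avro schema from the DataFrame's Spark SQL schema
val avroSchema = SchemaConverters.convertStructToAvro(
  df.schema, SchemaBuilder.record("my_table"), "com.example")

// The idea would be to save avroSchema.toString(true) as a .avsc file
// in an HDFS location, then write the data itself as Avro files:
df.write.format("com.databricks.spark.avro").save("/data/my_table")
```

and then point the external table at both, along these lines:

```sql
CREATE EXTERNAL TABLE my_table
STORED AS AVRO
LOCATION '/data/my_table'
TBLPROPERTIES ('avro.schema.url'='hdfs:///schemas/my_table.avsc');
```

Is this the right direction, or is the schema file unnecessary when the data files already embed the Avro schema?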