How to archive Hive external tables?

Hi Guys,

Is there a way to find Hive external tables that were created more than 90 days ago and drop those tables along with the underlying HDFS data? Can this be done from a Unix shell script?

You might consider looking at the Hive LanguageManual

https://cwiki.apache.org/confluence/display/Hive/LanguageManual

There you will see, for example, the Beeline - Command Line Shell.

https://cwiki.apache.org/confluence/display/Hive/HiveServer2+Clients#HiveServer2Clients-Beeline%E2%80%93CommandLineShell

You should be able to use this type of Hive CLI in shell scripts to work with Hive, as needed.
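As a minimal sketch of driving Hive from a shell script: in a real run the table list would come from the CLI (e.g. `hive -S -e 'show tables in mydb' > tables.txt`, with `mydb` a placeholder database name); simulated output is used below so the loop is self-contained.

```shell
# Real run would be:  hive -S -e 'show tables in mydb' > tables.txt
# ('mydb' is a placeholder.) Simulated output keeps this sketch runnable:
printf 'orders\ncustomers\nevents\n' > tables.txt

# Iterate over the table names, one per line
count=0
while read -r tbl; do
    count=$((count + 1))
    echo "table: $tbl"
done < tables.txt
```

Beeline works the same way from a script, with `beeline -u <jdbc-url> -e '...'` instead of `hive -e`.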

Or, you might consider other Hive CLIs, like

HCatalog CLI

Read the Hive CLI docs and experiment until you find a CLI that suits your needs.

I am using the Hive CLI, but how can I check whether a table was created more than 90 days ago?

desc formatted my_table;

You basically have to write a Hive SQL query to get the information you are seeking.

That's part of the Hive documentation.

How can I get this using SQL? I need to find the Hive tables that were created more than 90 days ago and drop them.

Execute the CLI command

desc formatted <database>.<table_name> 

in the Hive CLI.

It will show detailed table information with the creation time.
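The CreateTime and Location fields can be pulled out of that output with standard text tools. The snippet below uses a simulated `desc formatted` dump so it is self-contained; in a real run the file would come from `hive -S -e 'desc formatted mydb.my_table' > desc.txt` (`mydb.my_table` and the values are hypothetical, and the exact layout varies by Hive version, so check your own output first).

```shell
# Simulated `desc formatted` output; a real run would produce it with:
#   hive -S -e 'desc formatted mydb.my_table' > desc.txt
# (table name and values below are hypothetical)
cat > desc.txt <<'EOF'
Database:           mydb
CreateTime:         Mon Mar 01 10:15:00 UTC 2021
Location:           hdfs://namenode/warehouse/mydb/my_table
Table Type:         EXTERNAL_TABLE
EOF

# Pull out the two fields and strip the labels
create_time=$(grep '^CreateTime:' desc.txt | sed 's/^CreateTime:[[:space:]]*//')
location=$(grep '^Location:' desc.txt | sed 's/^Location:[[:space:]]*//')
echo "$create_time"
echo "$location"
```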

Yes, I did that, but I am stuck on extracting the creation date and location so that I can apply the 90-day logic. Can you advise how this can be done with scripting?

Just take the Hive CLI command you want and execute it in silent mode, similar to the example below, substituting your own commands and logic:

hive -S -e 'select col from tab1' > a.txt

Experiment and learn.

I tried something like this, but I am not able to calculate the number of days from the output:

hive -e 'desc formatted db.table_name' | grep CreateTime
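One way to do the day arithmetic is with GNU `date` (Linux). This is a sketch, assuming the CreateTime string looks like `Mon Mar 01 10:15:00 UTC 2021`; the format varies by Hive version, so verify it against your own cluster first. The Hive and HDFS calls are shown as comments because they need a live cluster, and `db.my_table` is a placeholder.

```shell
#!/bin/sh
# Sketch of the 90-day check, assuming GNU date (Linux).

age_in_days() {
    # $1: a timestamp string; prints its age in whole days
    created_epoch=$(date -d "$1" +%s) || return 1
    now_epoch=$(date +%s)
    echo $(( (now_epoch - created_epoch) / 86400 ))
}

# Illustrative only -- needs a working Hive client and HDFS access;
# 'db.my_table' is a placeholder:
#   created=$(hive -S -e 'desc formatted db.my_table' \
#             | grep 'CreateTime' | sed 's/^CreateTime:[[:space:]]*//')
#   loc=$(hive -S -e 'desc formatted db.my_table' \
#         | grep '^Location' | awk '{print $2}')
#   if [ "$(age_in_days "$created")" -ge 90 ]; then
#       hive -S -e "drop table db.my_table"
#       hadoop fs -rm -r "$loc"   # external table: data must be removed explicitly
#   fi
```

Note that `drop table` on an external table does not delete the HDFS files, which is why the sketch removes the Location directory separately.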

Why did you not post the command and the output?

We cannot help you if you do not post your code and its output in full.

Sorry, but we cannot help unless you post the key input/output information, since most users here are not using Hive.