
Hive Study Notes: Day 2 Recap and Day 3 (DDL and DML)


Hive Day 3

[toc]

Day 2 Content Review

Hive documentation home

https://cwiki.apache.org/confluence/display/Hive/Home

Hive SQL Language Manual:  Commands, CLIs, Data Types,
DDL (create/drop/alter/truncate/show/describe), Statistics (analyze), Indexes, Archiving,
DML (load/insert/update/delete/merge, import/export, explain plan),
Queries (select), Operators and UDFs, Locks, Authorization
File Formats and Compression:  RCFile, Avro, ORC, Parquet; Compression, LZO
Procedural Language:  Hive HPL/SQL
Hive Configuration Properties
Hive Clients
Hive Client (JDBC, ODBC, Thrift)
HiveServer2:  Overview, HiveServer2 Client and Beeline, Hive Metrics

DDL

https://cwiki.apache.org/confluence/display/Hive/LanguageManual+DDL

database

  1. create
  2. drop
  3. alter
  4. use
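
A minimal sketch of all four operations in HiveQL (the database name my_db and its property are illustrative):

-- create a database, skipping the error if it already exists
CREATE DATABASE IF NOT EXISTS my_db COMMENT 'demo database';

-- switch the current session to it
USE my_db;

-- alter: attach a custom property to the database
ALTER DATABASE my_db SET DBPROPERTIES ('owner' = 'albert');

-- drop: CASCADE also removes any tables inside, so use with care
DROP DATABASE IF EXISTS my_db CASCADE;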

Table

Create

CREATE [TEMPORARY] [EXTERNAL] TABLE

Create Table As Select (CTAS) and Create Table Like

CREATE [TEMPORARY] [EXTERNAL] TABLE [IF NOT EXISTS] [db_name.]table_name LIKE existing_table_or_view_name [LOCATION hdfs_path];
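
The syntax shown above is the LIKE form, which copies only the table definition and no data; CTAS is the AS select_statement form in the full grammar below. A short sketch of both (table names are illustrative; page_view is the sample table used later in this post):

-- CTAS: create the table and populate it from a query in one step
CREATE TABLE page_view_2008 AS
SELECT * FROM page_view WHERE dt = '2008-06-08';

-- LIKE: copy only the schema of an existing table; no rows are copied
CREATE TABLE page_view_empty LIKE page_view;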

Three table types

Temporary table: TEMPORARY

Its lifetime matches the Hive session: when the Hive client closes or exits, the table is deleted with it. A temporary table takes precedence over other tables: if a temporary table shares its name with another table, our operations hit the temporary one, and we cannot reach the other table until the temporary table is dropped or renamed.

External table: EXTERNAL

Hive manages only the metadata. Dropping the table removes just the metadata; the data on HDFS is not deleted. A LOCATION needs to be specified.

Internal (managed) table: no modifier

Hive manages everything, the metadata and the data on HDFS alike; drop the table and both are gone.

Pay special attention here: don't delete data unless you really mean to.
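
A minimal sketch of the three kinds (the column sets, table names, and HDFS path are placeholders):

-- temporary: lives only as long as the current session
CREATE TEMPORARY TABLE tmp_users (id INT, name STRING);

-- external: DROP removes only the metadata; files under LOCATION survive
CREATE EXTERNAL TABLE ext_users (id INT, name STRING)
LOCATION '/data/ext_users';

-- managed (no modifier): DROP removes the metadata and the HDFS data
CREATE TABLE managed_users (id INT, name STRING);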

CREATE [TEMPORARY] [EXTERNAL] TABLE [IF NOT EXISTS] [db_name.]table_name    -- (Note: TEMPORARY available in Hive 0.14.0 and later)
  [(col_name data_type [COMMENT col_comment], ... [constraint_specification])]
  [COMMENT table_comment]
  [PARTITIONED BY (col_name data_type [COMMENT col_comment], ...)]
  [CLUSTERED BY (col_name, col_name, ...) [SORTED BY (col_name [ASC|DESC], ...)] INTO num_buckets BUCKETS]
  [SKEWED BY (col_name, col_name, ...)                  -- (Note: Available in Hive 0.10.0 and later)]
     ON ((col_value, col_value, ...), (col_value, col_value, ...), ...)
     [STORED AS DIRECTORIES]
  [
   [ROW FORMAT row_format] 
   [STORED AS file_format]
     | STORED BY 'storage.handler.class.name' [WITH SERDEPROPERTIES (...)]  -- (Note: Available in Hive 0.6.0 and later)
  ]
  [LOCATION hdfs_path]
  [TBLPROPERTIES (property_name=property_value, ...)]   -- (Note: Available in Hive 0.6.0 and later)
  [AS select_statement];   -- (Note: Available in Hive 0.5.0 and later; not supported for external tables)

ROW FORMAT

Declares what format the raw data is in when it is loaded into the Hive table. Loading does not change the original data.

PARTITIONED BY

Partitions our data.

STORED AS

The file format the data is stored in.

LOCATION

The location of the table's directory on HDFS.
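
A sketch combining the four clauses (the columns, delimiter, and path are illustrative):

CREATE EXTERNAL TABLE logs (
  ip   STRING,
  url  STRING,
  hits INT
)
PARTITIONED BY (dt STRING)                     -- one subdirectory per dt value
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','  -- raw data is comma-separated text
STORED AS TEXTFILE                             -- stored as plain text files
LOCATION '/data/logs';                         -- the table's directory on HDFS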

Drop / Truncate
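
Both of these delete data, so the warning above applies. A minimal sketch (table names continue the examples above):

-- TRUNCATE keeps the table definition but removes all rows (managed tables only)
TRUNCATE TABLE managed_users;

-- DROP removes the table definition; for a managed table the HDFS data goes too
DROP TABLE IF EXISTS managed_users;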

DML

https://cwiki.apache.org/confluence/display/Hive/LanguageManual+DML

LOAD

LOAD DATA [LOCAL] INPATH 'filepath' [OVERWRITE] INTO TABLE tablename [PARTITION (partcol1=val1, partcol2=val2 ...)]

LOCAL

LOCAL, combined with INPATH, decides whether the data is read from HDFS or from the client's local filesystem.

When we load data, we are really moving a data file into the Hive warehouse directory, under the subdirectory named after the table.

A file already on HDFS is simply moved over.

With LOCAL, the file is first uploaded to a temporary directory and then moved into place.

OVERWRITE

Whether to overwrite the existing data.

If the existing data is not overwritten, the new file is copied into the Hive data directory alongside it, so a duplicate filename ends up as xxx_copy.

PARTITION

The partition to load into, e.g. PARTITION (gender='male', age='35')
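
A sketch of the variants above (the paths and the users_part table are placeholders):

-- from the client's local filesystem, appending to the table
LOAD DATA LOCAL INPATH '/tmp/users.csv' INTO TABLE managed_users;

-- from HDFS, replacing whatever the target partition already holds
LOAD DATA INPATH '/staging/users_male_35.csv' OVERWRITE INTO TABLE users_part
PARTITION (gender='male', age='35');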

INSERT

Inserting data into Hive tables from queries

Standard syntax:
INSERT OVERWRITE TABLE tablename1 [PARTITION (partcol1=val1, partcol2=val2 ...) [IF NOT EXISTS]] select_statement1 FROM from_statement;
INSERT INTO TABLE tablename1 [PARTITION (partcol1=val1, partcol2=val2 ...)] select_statement1 FROM from_statement;
 
Hive extension (multiple inserts):
FROM from_statement
INSERT OVERWRITE TABLE tablename1 [PARTITION (partcol1=val1, partcol2=val2 ...) [IF NOT EXISTS]] select_statement1
[INSERT OVERWRITE TABLE tablename2 [PARTITION ... [IF NOT EXISTS]] select_statement2]
[INSERT INTO TABLE tablename2 [PARTITION ...] select_statement2] ...;
FROM from_statement
INSERT INTO TABLE tablename1 [PARTITION (partcol1=val1, partcol2=val2 ...)] select_statement1
[INSERT INTO TABLE tablename2 [PARTITION ...] select_statement2]
[INSERT OVERWRITE TABLE tablename2 [PARTITION ... [IF NOT EXISTS]] select_statement2] ...;
 
Hive extension (dynamic partition inserts):
INSERT OVERWRITE TABLE tablename PARTITION (partcol1[=val1], partcol2[=val2] ...) select_statement FROM from_statement;
INSERT INTO TABLE tablename PARTITION (partcol1[=val1], partcol2[=val2] ...) select_statement FROM from_statement;
Example
FROM page_view_stg pvs
INSERT OVERWRITE TABLE page_view PARTITION(dt='2008-06-08', country)
       SELECT pvs.viewTime, pvs.userid, pvs.page_url, pvs.referrer_url, null, null, pvs.ip, pvs.cnt
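
In the example above, dt is a static partition value while country is dynamic. Dynamic partition inserts normally require these settings first (nonstrict mode is needed when every partition column is dynamic):

SET hive.exec.dynamic.partition=true;
SET hive.exec.dynamic.partition.mode=nonstrict;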

Inserting values into tables from SQL

Standard Syntax:
INSERT INTO TABLE tablename [PARTITION (partcol1[=val1], partcol2[=val2] ...)] VALUES values_row [, values_row ...]
 
Where values_row is:
( value [, value ...] )
where a value is either null or any valid SQL literal
Example
CREATE TABLE students (name VARCHAR(64), age INT, gpa DECIMAL(3, 2))
  CLUSTERED BY (age) INTO 2 BUCKETS STORED AS ORC;
 
INSERT INTO TABLE students
  VALUES ('fred flintstone', 35, 1.28), ('barney rubble', 32, 2.32);
  
CREATE TABLE pageviews (userid VARCHAR(64), link STRING, came_from STRING)
  PARTITIONED BY (datestamp STRING) CLUSTERED BY (userid) INTO 256 BUCKETS STORED AS ORC;
 
INSERT INTO TABLE pageviews PARTITION (datestamp = '2014-09-23')
  VALUES ('jsmith', 'mail.com', 'sports.com'), ('jdoe', 'mail.com', null);
 
INSERT INTO TABLE pageviews PARTITION (datestamp)
  VALUES ('tjohnson', 'sports.com', 'finance.com', '2014-09-23'), ('tlee', 'finance.com', null, '2014-09-21');
