Detailed Steps for Exporting Data from SQL Server to Azure HBase / Hive


The Hadoop on Azure Sqoop Import Sample Tutorial


  

Overview

This tutorial shows how to use Sqoop to import data from a SQL database on Windows Azure to a Hadoop on Azure HDFS cluster.

While Hadoop is a natural choice for processing unstructured and semi-structured data, such as logs and files, there may also be a need to process structured data stored in relational databases.

Sqoop is a tool designed to transfer data between Hadoop and relational databases. You can use it to import data from a relational database management system (RDBMS) such as SQL Server, MySQL, or Oracle into the Hadoop Distributed File System (HDFS), transform the data in Hadoop with MapReduce or Hive, and then export the data back into an RDBMS. In this tutorial, you are using a SQL Database for your relational database.
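The import → transform → export round trip described above can be sketched as a pair of Sqoop invocations. This is a minimal sketch with hypothetical server, table, and directory names (not values from this tutorial); it only assembles and prints the commands, since running them requires Sqoop on the cluster head node.

```shell
# Hypothetical Sqoop round trip: import from an RDBMS, transform in Hadoop, export back.
# The commands are echoed rather than executed so their shape is visible.
JDBC_URL="jdbc:sqlserver://myserver.database.windows.net;database=AdventureWorks2012"

# 1. Import a table into HDFS (-m 1 = use a single map task, no splitting).
echo sqoop import --connect "$JDBC_URL" --table Sales.SalesOrderDetail \
    --target-dir /data/salesOrderDetail -m 1

# 2. Transform the imported files with MapReduce or Hive (not shown here).

# 3. Export the transformed results back into a table in the RDBMS.
echo sqoop export --connect "$JDBC_URL" --table Sales.SalesOrderDetailOut \
    --export-dir /data/salesOrderDetailOut -m 1
```

The `--target-dir` of the import and the `--export-dir` of the export are HDFS paths; the table named in the export must already exist in the database.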

Sqoop is an open source software product of Cloudera, Inc. Software development for Sqoop has recently moved from GitHub to the Apache Sqoop site.

In Hadoop on Azure, Sqoop is deployed from the Hadoop Command Shell on the head node of the HDFS cluster. You use the Remote Desktop feature available in the Hadoop on Azure portal to access the head node of the cluster for this deployment.

Goals

In this tutorial, you will see three things:

  1. How to set up a SQL database on Windows Azure for use with the tutorial.

  2. How to use the Remote Desktop feature in Hadoop on Azure to access the head node of the HDFS cluster.

  3. How to import relational data from SQL Server to a Hadoop on Azure HDFS cluster by using Sqoop.

Key technologies

Setup and Configuration

You must have an account to access Hadoop on Azure and have created a cluster to work through this tutorial. To obtain an account and create a Hadoop cluster, follow the instructions outlined in the Getting started with Microsoft Hadoop on Azure section of the Introduction to Hadoop on Azure topic.

You will also need your outward facing IP address for your current location when configuring your firewall on SQL Database. To obtain it, go to the site WhatIsMyIP and make a note of it. Later in the procedure, you also need the outward facing IP address for the head of the Hadoop cluster. You can obtain this IP address in the same way.


Tutorial

This tutorial is composed of the following segments:

  1. How to set up a SQL database.

  2. How to use Sqoop from Hadoop on Azure to import data to the HDFS cluster.

How to set up a SQL database

Log in to your Windows Azure account. To create a database server, click the Database icon in the lower left-hand corner of the page.

On the Getting Started page, click the Create a new SQL Database Server option.

Select the type of subscription (such as Pay-As-You-Go) associated with your account in the Create Server window and press Next.

Select the appropriate Region in the Create Server window and click Next.

Specify the login and password of the server-level principal of your SQL Database server and then press Next.

Press Add to specify a firewall rule that allows your current location access to SQL Database to upload the AdventureWorks database. The firewall grants access based on the originating IP address of each request. Use the IP address found during the configuration preliminaries of this tutorial for the values to add. Specify a Rule name, such as the one shown, but remember to use your IP address, not the one used for illustration purposes below. (You must also add the outward IP address of the head node in your Hadoop cluster. If you know it already, add it now.) Then press the Finish button.

Download the AdventureWorks2012 database onto your local machine from the Recommended Downloads link on the Adventure Works for SQL Database site.

Unzip the file, open an Administrator Command Prompt, and navigate to the AdventureWorks directory inside the AdventureWorks2012ForSQLAzure folder.

Run CreateAdventureWorksForSQLAzure.cmd by typing the following:

CreateAdventureWorksForSQLAzure.cmd servername username password

For example, if the assigned SQL Database server is named b1gl33p, the administrator user name "Fred", and the password "Secret", you would type the following:

CreateAdventureWorksForSQLAzure.cmd b1gl33p.database.windows.net Fred@b1gl33p Secret

The script creates the database, installs the schema, and populates the database with sample data.

Return to the WindowsAzurePlatform portal page, click your subscription on the left-hand side (Pay-As-You-Go in the example below) and select your database (here named wq6xlbyoq0). The AdventureWorks2012 database should be listed in the Database Name column. Select it and press the Manage icon at the top of the page.

Enter the credentials for the SQL database when prompted and press Log on.

This opens the Web interface for the Adventure Works database on SQL Database. Press the New Query icon at the top to open the query editor.

Since Sqoop currently adds square brackets to the table name, we need to add a synonym to support two-part naming for SQL Server tables. To do so, run the following query:

CREATE SYNONYM [Sales.SalesOrderDetail] FOR Sales.SalesOrderDetail

Run the following query and review its result.

select top 200 * from [Sales.SalesOrderDetail]

How to use Sqoop from Hadoop on Azure to import SQL Database query results to the HDFS cluster

From your Account page, scroll down to the Open Ports icon in the Your cluster section and click the icon to open the ODBC Server port on the head node in your cluster.

Return to your Account page, scroll down to the Your cluster section and click the Remote Desktop icon this time to open the head node in your cluster.

Select Open when prompted to open the .rdp file.

Select Connect in the Remote Desktop Connection window.

Enter your credentials for the Hadoop cluster (not your Hadoop on Azure account) into the Windows Security window and select OK.

Open Internet Explorer and go to the site WhatIsMyIP to obtain the outward facing IP address for the head node of the cluster. Return to the SQL Database management page and add a firewall rule that allows your Hadoop cluster access to SQL Database. The firewall grants access based on the originating IP address of each request.

Double-click the Hadoop Command Shell icon in the upper left-hand corner of the Desktop to open it. Navigate to the "c:\Apps\dist\sqoop\bin" directory and run the following command:

sqoop import --connect "jdbc:sqlserver://[serverName].database.windows.net;username=[userName]@[serverName];password=[password];database=AdventureWorks2012" --table Sales.SalesOrderDetail --target-dir /data/lineitemData -m 1

So, for example, for the following values:
* server name: wq6xlbyoq0
* username: HadoopOnAzureSqoopAdmin
* password: Pa$$w0rd

The sqoop command is:

sqoop import --connect "jdbc:sqlserver://wq6xlbyoq0.database.windows.net;username=HadoopOnAzureSqoopAdmin@wq6xlbyoq0;password=Pa$$w0rd;database=AdventureWorks2012" --table Sales.SalesOrderDetail --target-dir /data/lineitemData -m 1
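Because the --connect string packs several values together (note the username=login@server convention for SQL Database), it is easy to mistype. A small sketch, using the illustrative values above, that assembles the same command from variables:

```shell
# Assemble the sqoop import command from its parts (illustrative values from above).
SERVER="wq6xlbyoq0"                  # assigned SQL Database server name
USERNAME="HadoopOnAzureSqoopAdmin"   # server-level principal login
PASSWORD='Pa$$w0rd'                  # single quotes keep the $ characters literal
DB="AdventureWorks2012"

CONNECT="jdbc:sqlserver://${SERVER}.database.windows.net;username=${USERNAME}@${SERVER};password=${PASSWORD};database=${DB}"

# Echoed rather than executed; run the printed command in the Hadoop Command Shell.
echo sqoop import --connect "$CONNECT" --table Sales.SalesOrderDetail \
    --target-dir /data/lineitemData -m 1
```

Keeping the credentials in variables also makes it easy to reuse the same connection string for later imports or exports against the same server.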

Return to the Accounts page of the Hadoop on Azure portal and open the Interactive Console this time. Run the #lsr command from the JavaScript console to list the files and directories on your HDFS cluster. 

Run the #tail command to view selected results from the part-m-00000 file.

tail /user/RAdmin/data/SalesOrderDetail/part-m-00000
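The same checks can also be made from the Hadoop Command Shell on the head node: hadoop fs -lsr and hadoop fs -tail are the command-line counterparts of the console's #lsr and #tail. A sketch, assuming the same output path as above (the user directory depends on your cluster login):

```shell
# Command-line equivalents of the JavaScript console's #lsr and #tail commands.
# Echoed rather than executed here, since they need a live HDFS cluster.
TARGET="/user/RAdmin/data/SalesOrderDetail"

echo hadoop fs -lsr "$TARGET"                 # recursively list the import output
echo hadoop fs -tail "$TARGET/part-m-00000"   # show the last kilobyte of the split
```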


Summary

In this tutorial, you have seen how to use Sqoop to import data from a SQL database on Windows Azure to a Hadoop on Azure HDFS cluster.
