Hive Connection String

Hive is an open source data warehouse project for queries and analysis of large datasets. It provides an SQL-like interface called HiveQL to access distributed data stored in various databases and file systems, and it supports data queries, analysis, and summarization. This post discusses one of the most common Hive clients, the JDBC client, for both HiveServer1 (the original Thrift server) and HiveServer2. HiveServer1 does not support Kerberos security; HiveServer2 does, which is one reason its use is recommended (HiveServer1 also has several concurrency issues and lacks other features available in HiveServer2). I am working on a Hadoop environment and Hive database and needed the connection string for Hive; after struggling for two days I finally figured out how to connect, so the notes below may save you the trouble.

If you want to use a connection string for connecting to a database, or if your application requires it, you must specify either a DSN (data source name), a File DSN, or a DSN-less connection in the string. Using an ODBC source can be critical to some implementation plans; however, actually finding the correct strings and syntax to use can be challenging. The pyodbc test suite lets you verify that an ODBC driver is compatible with pyodbc: add a section such as "[apitest] connection-string=DSN=MSSQL-PYTHON" to the test configuration and run the tests with python setup.py.

Most tools expose the same basic connection properties. In Informatica, you can create and manage a Hive connection in the Administrator tool, the Analyst tool, or the Developer tool; Hive connection properties are case sensitive unless otherwise noted. In Talend Open Studio, the tHiveConnection component is used to configure the connection to Hive; you supply the server address and port (for example, talend-hdp-all:50300) and, in the Rows Fetched Per Block field, type the number of rows to be fetched per block. In other tools you locate "JDBC", click "Make New Connection", type a friendly name for the connection, and enter values for authentication credentials and the other properties required to connect to Hive. Once you have created a connection to an Apache Hive database, you can select data from the available tables and then load that data into your app or document. Some users report problems starting a connection with the HDFS Connector or the Hive Connector, and if your connection string uses SSL/TLS the corresponding SSL settings must be supplied as well; both usually come back to the connection string itself.

For Kerberized clusters, the JDBC connection string also contains the parameter ServicePrincipalName, which should be set to the Hive principal name in the format hive/<host_fqdn>@<your_realm>. If an OLE DB connection fails with the default settings, one workaround is to select the second option (Use connection string) on the Connection tab of the Data Link Properties dialog, as shown in Figure 8. Power BI Desktop can connect to Impala in Import or DirectQuery mode, but refresh through the Power BI Gateway is still a planned feature, so scheduled refresh is currently limited. For more information on the Hive JDBC interface, see HiveJDBCInterface; for more information about Hive data types, see the Dates page on the Apache Hive website.
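For orientation, the HiveServer2 JDBC URL has a documented general shape; the host, port, and database below are placeholders, not values from any particular cluster:

    jdbc:hive2://<host>:<port>/<dbName>;<sessionVar=value>?<hiveConfVar=value>#<hiveVar=value>

    jdbc:hive2://hive-server.example.com:10000/default

HiveServer1 used the older jdbc:hive:// prefix; HiveServer2 uses jdbc:hive2:// and listens on port 10000 by default, which is why most of the examples in this post start from that form.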
I'm not a Kerberos expert by any means, but I'm pretty sure KrbHostFQDN needs to be the actual fully qualified domain name of the host that HiveServer2 is running on; the same host name is what goes into the service principal in an AD Kerberized cluster's Hive connection string. In a Java client, the connection is then opened with the DriverManager.getConnection() method while supplying a connection URL. In remote mode you create a JDBC connection string for making an unauthenticated connection to the Hive default database on localhost port 10000, while in embedded mode the URL carries no host at all. If a test JDBC connection using the right Hive username and password still fails with a "Could not ..." error, check the server type and port before anything else. You can also write a Java program that executes a Hive query and assigns the result to a variable. For others who like details, the steps are reviewed below.

There are many different types of Hive setups, such as those from Hortonworks, Cloudera, Amazon EMR, and the base Apache Hive distribution, and most BI tools can connect to any of them as long as the connection string matches. The Spark SQL Thrift JDBC server is designed to be "out of the box" compatible with existing Hive installations, and with Cloudera Data Warehouse you can move selected workloads from a traditional data warehouse onto a CDH cluster. The RJDBC package allows the use of any DBMS in R through the JDBC interface, and Presto can likewise be accessed from Java using its JDBC driver. Microsoft has deprecated the Hive plug-in for Excel (a little disappointing, since it was never production ready but it did offer some really nice functionality), so ODBC is now the usual route from Office tools.

A few practical notes. If you set the MSGLEVEL=I system option in SAS and re-run a failed request, the data source connection strings generated by the procedure are written to the SAS log. In an Azure App Service, the Connection strings section lets you set up all the connection strings and choose the connection type; these settings are injected at run time and can be accessed through IConfiguration, so use the connection string format required by the database. Application code can likewise read the first Apache Hive connection string defined in the application configuration file. If you are using an external metastore and you have a large number of clusters running, increase the connection limit on your external metastore. Just like the Database Connection dialog box, most tools let you save a connection or work with an existing connection.
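For concreteness, the three URL forms mentioned above typically look like the following; the host, realm, and database names are placeholders:

    Embedded mode:      jdbc:hive2://
    Remote, no auth:    jdbc:hive2://localhost:10000/default
    Remote, Kerberos:   jdbc:hive2://hs2.example.com:10000/default;principal=hive/hs2.example.com@EXAMPLE.COM

In the Kerberos form, the host part of the principal should match the HiveServer2 host's fully qualified domain name, which is the same requirement the KrbHostFQDN setting expresses on the ODBC side.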
TO IMPORT & EXPORT DATA FROM RDBMS (MYSQL,ORACLE, etc) INTO HDFS / HIVE / HBASE Pre-requisite Apache Hadoop Apache Sqoop (compatible with Hadoop version) Apache Hive (optional) Apache HBase (optional) Apache HCatalog (optional) JDBC/ODBC connector For all RDBMS, Connection URL changes and remaining all command line arguments remains same. Data from Solr can be presented as a Hive table to be joined with other Hive tables, and data. With this installation a new DSN with the name "Sample Microsoft Hive DSN" was created automatically. I'm not a Kerberos expert by any means, but I'm pretty sure KrbHostFQDN needs to be the actual fully qualified domain name of the host that HiveServer 2 is running on. I am not familiar with the difference between cloudera hive and imphala. For example: I have a default. The ConnectionStringBuilder class from Syncfusion. 2 beta) to connect to Apache Hive? How do I configure the connection to use kerberos authentication and/or Apache Knox?. (KGET) — Kern County Sheriff's Office Rural Crime investigators say they cracked a case of a string of suspected bee hive heists. Please refer below table to convert any date format into fixed format i. Go to Options > Advanced Options > Manage Data Connections and select Add Connection > Other. 13 from IBM Impala from the drop down menu at the top and put in the JDBC connection string, similar. Connection reset. used in the images and examples below represent a fictitious sample for clarity or are default installation values. TIBCO Spotfire® connects to virtually any JDBC compliant data source via the Spotfire Server Information Services interface. In these cases, Sqoop will emit a warning in its log. Hive users will note that there is not a one-to-one mapping between SQL types and Hive types. On Windows systems, it can also connect via ODBC. Port — Enter the listening port for the Hive service. Use the connection string format required by the database. Installation and Configuration Guide. In order to check the connection between Spark SQL and Hive metastore, the verification of the list of Hive databases and tables using Hive prompt could be done. If you are already using JDBC applications with an earlier Impala release, you should update your JDBC driver, because the Hive 0. The data source is a location of your data (it can be a server, CSV, or a DDL file). In case of properties which are present in both beeline-hs2-connection. For more information on the Hive JDBC Interface, see HiveJDBCInterface. String functions are classified as those primarily accepting or returning STRING, VARCHAR, or CHAR data types, for example to measure the length of a string or concatenate two strings together. Configure an ODBC Connection. If you use the Hive connection to run mappings in the Hadoop cluster, the Data Integration service executes only the environment SQL of the Hive connection. 1 - Connection String for Datasource is not in JDBC Format. In the Properties dialog, go to the Hive JDBC Connection Properties tab. The JDBC Connection string also contains parameter ServicePrincipalName which should be set to hive principal name, in format hive/[email protected]_realm. Toad allows you to connect to a Hive data warehouse system. All the configurations will be injected and can accessed through IConfiguration. Connection String Examples. The Hive Query Language is a subset of SQL-92. Cloudera has been named as a Strong Performer in the Forrester Wave for Streaming Analytics, Q3 2019. Saperator – Saperator used while creating Hive Table. 
In this article, we will check the steps to connect to HiveServer2 from Python using the Hive JDBC drivers; two related articles that may help are "How to Access Hive via Python?" and "Python + JDBC = Dynamic Hive scripting". Here's an example of the format that can be used, with a Python sketch following below. If you are curious what JDBC driver is needed to make the Hive connection work, download the Hive JDBC driver jar and add it to the class path of your Java application (or of your Python JDBC bridge), then supply the JDBC connection string, which is simply the string used to connect to the database. One convenient pattern is first to create a JSON file containing the connection strings and read it at start-up. Keep in mind that when SAS/ACCESS creates a JDBC connection to Hive, it places the user ID and password that you provided in the JDBC connection string, and that the difference between DSN, File DSN, and DSN-less connections is whether you use the DSN=, FILEDSN=, or DRIVER= keyword in the connection string, as described in the ODBC specification.

Figure 8: Hive ODBC Connections.

Most client tools follow the same pattern. In Qlik Sense, you connect to an Apache Hive database through the Add data dialog or the Data load editor. Talend Open Studio for Big Data provides components such as tHiveConnection to integrate with Hive; in the Name node field, enter the location of the master node, the NameNode, of the distribution to be used. RazorSQL can connect to Apache Hive databases via JDBC on Windows, Mac, and Linux, and after a successful connection in SQL Workbench/J the connection details appear at the top of the window. ODBC connections can also be created and managed from PowerShell 4.0. A forum thread titled "Solved: I am trying to connect to hive through a java program using the below connection string" covers the same ground for Java clients.

A few operational notes. Configuration properties can be set in more than one way, and you should export the Hadoop path in hive-config.sh before starting Hive. A PDI (Pentaho Data Integration) job can load data into a Hive table, you can import data from a text file into a Hive table using Sqoop, and Oracle SQL Connector for HDFS uses external tables to provide Oracle Database with read access to Hive tables and to delimited text files and Data Pump files in HDFS. Using Smart Data Access (SDA) with Hadoop seems a great idea for balancing the strengths of both tools, and similar connection-string questions come up when exporting two billion or more records from Hadoop into Teradata with TDCH (Teradata Connector for Hadoop). Unfortunately, for real-time responsiveness Hive SQL currently isn't the most optimal tool in Hadoop (it is better used for batched SQL commands). Finally, if the Hive sources and targets are on different clusters, the Data Integration Service does not execute the different environment SQL commands for the connections of the Hive source or target.
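A minimal sketch of the Python-over-JDBC approach, assuming the standalone Hive JDBC jar has been downloaded locally and the jaydebeapi package is installed (pip install jaydebeapi); the host, credentials, and jar path are placeholders:

    import jaydebeapi

    # Start a JVM, load the Hive JDBC driver, and open a HiveServer2 session.
    conn = jaydebeapi.connect(
        "org.apache.hive.jdbc.HiveDriver",
        "jdbc:hive2://hive-server.example.com:10000/default",
        ["hiveuser", "hivepassword"],
        "/path/to/hive-jdbc-standalone.jar",
    )

    cursor = conn.cursor()
    cursor.execute("SHOW TABLES")
    print(cursor.fetchall())

    cursor.close()
    conn.close()

The same connection string works unchanged in Beeline or a Java program; only the wrapper around it differs.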
For general information about the SAS LIBNAME statement for relational databases, see "LIBNAME Statement for Relational Databases"; SAS builds its Hive connection from the same kind of options. When a Spark StringType column has maxLength metadata, it is converted to a Hive Varchar column; otherwise, it is converted to a Hive String column. A recurring question is how to connect to Cloudera Impala 2.x; keep in mind that Hive and Impala use different drivers and ports even though their connection strings look similar.

Some background on the JDBC connection string itself. Last October 28th Microsoft released a new Azure service called Windows Azure HDInsight Service, its Hadoop offering in the cloud ("Announcing Windows Azure HDInsight: Where big data meets the cloud"); along with this core service, a series of additional components was released to integrate the big data world with Microsoft BI. The Sisense ODBC Driver with SQL Connector for Apache Hive enables standard SQL-92 access directly to Apache Hive distributions, and the CData ODBC Driver for Apache Hive documents the connection string properties that describe the various options that can be used to establish a connection. The original Hive ODBC driver was developed with Thrift trunk version r790732, but the latest revision should also be fine. In Hive command-line mode or a script you can issue "use database" followed by a query.

What happens if a failover occurs with the default connection string? If the connection string is the default one, then in the event of a failover the JDBC connector has no idea there is a shadow master that can take over; a service-discovery connection string (sketched below) is the usual remedy. Check the JDBC connection string set up in the Hive connector stage and ensure the port is the valid Thrift port for the Hive server. Tip: connections are stored in the connections file. The referenced connection ID is the same connection ID as the Hive connection from step 2, where the extra settings were appended to the JDBC connection string after a ";". After downloading and unzipping the driver, you need to add all of its files to a classpath (in DQA, for example). Just make sure to substitute the connection string, username, and password with your own.
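One common way to handle that failover case, not described above and therefore an assumption about your deployment, is HiveServer2's ZooKeeper-based service discovery, where the connection string names the ZooKeeper quorum instead of a single host; the zk1/zk2/zk3 hosts and namespace below are placeholders:

    jdbc:hive2://zk1.example.com:2181,zk2.example.com:2181,zk3.example.com:2181/default;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2

The client asks ZooKeeper for a live HiveServer2 instance each time it connects, so a failed server is simply skipped, and if establishing the connection fails another server from the list is tried.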
In SQLAlchemy terms, the Engine is "home base" for the actual database and its DBAPI, delivered to the application through a connection pool and a Dialect, which describes how to talk to a specific kind of database/DBAPI combination; the connection string is what the Engine is built from, so use the connection string format required by the database. The Hive JDBC driver allows you to access Hive from a Java program that you write, or from a Business Intelligence or similar application that uses JDBC to communicate with database products, and SAS/ACCESS can access Hadoop data through a JDBC connection to a HiveServer2 service. On the ODBC side, the Microsoft Hive ODBC Driver and the Cloudera ODBC Driver for Hive enable business intelligence (BI), analytics, and reporting on Hadoop/Hive-based data. Installing the driver on Windows is straightforward; on 64-bit Windows operating systems you can execute both 32- and 64-bit applications, so install the driver bitness that matches your client. Typically, after installing the Simba Hive ODBC Driver, you need to create a Data Source Name (DSN), although once you have a rough idea what your connection string will look like you can use a few lines of code to create a connection object directly from that string, as sketched below; just substitute the connection string, username, and password with your own. (The same pattern appears in samples that connect to the MS Access 2010 version of the AdventureWorks database.)

A few more tool-specific notes. In the job design canvas, double-click the Hive Connector stage icon to open the stage editor. In the Connection Manager, right-click the Impala connection and select Properties, and in the Connection area enter the connection parameters for the Hive database to be used; the matching configuration parameters must also be set appropriately in hive-site.xml. Common questions in this area include whether the latest version of Oracle SQL Developer (or the v4.2 beta) can connect to Apache Hive and how to configure that connection to use Kerberos authentication and/or Apache Knox, and a reported issue where the MicroStrategy Simba Hive driver couldn't be loaded on RHEL 7.2. A NULL value is returned when you open a workbook in Tableau 9.x releases that was created in an earlier version and has date/time data stored as a string in a format that Hive doesn't support. Connection strings are not unique to Hive, either: an Event Hubs connection string is required to connect to the Event Hubs service, and you can get it from the Azure portal or build it with the ConnectionStringBuilder class in the client library, much as "az iot hub show-connection-string --hub-name {YourIoTHubName} --output table" retrieves one for an IoT hub.
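A minimal sketch of that "connection object from a string" step, assuming a Hive ODBC driver and a DSN are already installed; the DSN name, driver name, host, and credentials are placeholders, and the exact keyword names vary by driver:

    import pyodbc

    # DSN-based connection string; autocommit avoids transaction calls the Hive driver may not support.
    dsn_string = "DSN=Sample Microsoft Hive DSN;UID=hiveuser;PWD=hivepassword"
    conn = pyodbc.connect(dsn_string, autocommit=True)

    # A DSN-less alternative uses the DRIVER= keyword instead of DSN=, for example:
    # "DRIVER={Microsoft Hive ODBC Driver};HOST=hive-server.example.com;PORT=10000;UID=hiveuser;PWD=hivepassword"

    cursor = conn.cursor()
    cursor.execute("SHOW TABLES")
    for row in cursor.fetchall():
        print(row)
    conn.close()

The DSN form keeps the driver details in the ODBC administrator, while the DSN-less form carries everything in the string itself, which is the distinction between the DSN= and DRIVER= keywords discussed earlier.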
All the columns have the string (character varying) data type in Hive, Impala, Spark, and Drill, so cast where you need dates or numbers. A DSN is a data structure that stores connection information so that it can be used by the driver to connect to Hive; an administrator can choose to use a DSN or, for even better performance ideal for large numbers of users, form a DSN-less (direct) connection. For details on how to install and use ODBC to connect to Hive, see the Hive ODBC Connector documentation, and see also "About Apache Hive Data Sources". Some databases, such as PostgreSQL, require the schema in the connection string, and the 9.3M2 release of SAS/ACCESS Interface to Hadoop supports only Hive.

First of all, I need to have a connection string to my instance of Hive, and I can build it very easily in two ways: define a DSN, or define the driver plus the connection properties directly. With the Hive driver defined, all that's left is to define the connection string: fill in the connection properties and copy the connection string to the clipboard. I successfully established a connection to Impala with the corresponding driver and connection string, but at first I didn't know the JDBC connection string to connect to Hive. To access Hive on Azure from a reporting tool, select Others (OLEDB/ODBC), since we will be using the ODBC Driver for Hive, and click Next; a .pbix file built this way in Power BI Desktop against the Hive ODBC connection deploys successfully, and the results are visible in the web app. In .NET applications you can also read connection strings when you don't have the all-important Web.config file, and keep the value empty if the connection string is already in CFConnectionStrings. The newly created view, based on a linked-server connection to Hive, can then be referenced in an SSAS project just like any other table or view.

On the server side, the Apache Hive metastore is normally configured with the embedded Derby database, but that setting is recommended only for testing or ad-hoc development. A production setup creates the hive user and the Hive metastore's database in an external RDBMS and points the metastore at it through the JDBC connect string (javax.jdo.option.ConnectionURL) and the username to use against the metastore database (javax.jdo.option.ConnectionUserName). Many tools also offer a Test Connection feature that acts as a NOOP command: it only verifies that a connection can be made to Hive, and nothing from this initial connection is maintained.
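For reference, a minimal hive-site.xml fragment for an external metastore might look like the sketch below; the MySQL host, database name, and credentials are placeholders, while the property names are the standard javax.jdo.option keys:

    <property>
      <name>javax.jdo.option.ConnectionURL</name>
      <value>jdbc:mysql://metastore-db.example.com:3306/hive_metastore?createDatabaseIfNotExist=true</value>
      <description>JDBC connect string for a JDBC metastore</description>
    </property>
    <property>
      <name>javax.jdo.option.ConnectionDriverName</name>
      <value>com.mysql.jdbc.Driver</value>
    </property>
    <property>
      <name>javax.jdo.option.ConnectionUserName</name>
      <value>hive</value>
    </property>
    <property>
      <name>javax.jdo.option.ConnectionPassword</name>
      <value>hive_password</value>
    </property>

Note that this metastore connection string is separate from the HiveServer2 connection string that clients use; it tells the Hive service where its own metadata lives.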
If you have already used HiveServer2 over JDBC with an on-premises Hadoop cluster, you can skip the TL;DR part: just review the connection string sketched below and note how the Hive JDBC connection string differs for Azure HDInsight. Hive lets structure be projected onto data already in storage, and the connection properties and values used here are the same ones used to connect to your HiveServer2 using Beeline; for example, the same values can be supplied in beeline-hs2-connection.xml. A related guide is "Setting Up a Hive Connection with Kerberos and LDAP using Cloudera JDBC Drivers (Windows)".

A few property-level notes. Unless otherwise noted, properties can be set for a DataSource object or for a Connection object; username (String) is the connection username to be passed to the JDBC driver to establish a connection, and Description lets you enter an optional description for a DSN. For Rows Fetched Per Block, any positive 32-bit integer is a valid value, but test the effect on performance before raising it. A Kerberos principal parameter should ideally be part of the SessionConf section of the URL, as that is where the rest of the connection parameters are. Some products read the connection string from the Windows registry (for example under a vendor key such as SOFTWARE\Axero), and the registry value can be stored in a couple of common formats. When you click the Connect to Hadoop Hive link on the Amp Home page, the Connect to Hive dialog box opens with specific connection items for Hive. Starting the Spark Thrift server with explicit options gives you more flexibility in configuring it and using different properties than those defined in spark-defaults.conf. If a query fails with an error such as "FAILED: Execution Error, return code 1 from org...", check the cluster logs rather than the connection string, and I'd also check with your IT. For the date examples, the input column name is dt (String); replace dt with your own column name. (The Snowflake ODBC driver, for comparison, also utilizes both configuration parameters and connection parameters.)
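As a point of comparison, and assuming a default HDInsight cluster where HiveServer2 is reached over HTTPS through the cluster gateway (the cluster name and credentials are placeholders), a commonly cited pair of connection strings looks like this:

    On-premises:      jdbc:hive2://hs2.example.com:10000/default
    Azure HDInsight:  jdbc:hive2://myhdicluster.azurehdinsight.net:443/default;ssl=true;transportMode=http;httpPath=/hive2

The HDInsight form uses HTTP transport over port 443 and authenticates with the cluster gateway credentials rather than a Kerberos ticket, which is the main practical difference from the on-premises string.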
When you pass Hive and Oozie metastore information to ClusterCreateParameters objects, the call can complain about not being able to connect to the Azure SQL Server; the key piece of configuration here is the JDBC connection string, and the methods for setting the parameters differ depending on the environment in which the driver is installed. On the Properties page, specify values for the connection properties, and from the Hive server type drop-down choose whether the connector should autodetect the type of server to connect to or connect explicitly to a Hive1 or Hive2 server. Adding ODBC connections for Hadoop has also become easier: PowerShell 4 introduced new cmdlets that make it very easy to create and manage ODBC connections, and the setup package for a typical driver generally installs about 36 files. Testing tools can reuse the same connection details; SoapUI and JMeter, which most testers use, can both connect to a Kerberos-enabled Hive. Client libraries fit in as well: in pandas, the sql argument is a string or SQLAlchemy Selectable (a select or text object), that is, the SQL query to be executed or a table name.

Failover and security behave as follows. In Drill, if there are three unique drillbits listed in the connection string and the "tries" value is set to 5, the client can try to connect to each drillbit once until a successful connection is made; if the client cannot successfully connect to any of the drillbits, Drill returns a failure message. Similarly, if establishment of a Hive connection fails, another server in the connect string will be tried (the order is non-deterministic, as the list is randomly shuffled) until a connection is established. Could you please clarify whether you are using Kerberos authentication? If you are, and you have a keytab file which authenticates you, the kinit tool will allow you to create a ticket and connect to Hive locally, as sketched below. A few other notes: in Spark 1.x the Thrift server is compiled against a specific Hive 0.x release, and for tuning suggestions refer to the blog post "How to: Run Queries on Spark SQL using JDBC via Thrift Server"; the Hive CLI command could be deprecated in the future, as discussed within the Hive developer community; a HiveQueryManager class object can be used to submit Hive and Sqoop jobs; and questions such as "what is the Hive connection URL string in BDD 1.x?" or how to register an RDS endpoint in MySQL Workbench (in the main console, click the MySQL Connections "+" sign and provide the connection name, hostname, port, username, and password) all come back to the same connection-string basics.
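A minimal sketch of that ticket-based flow; the principal, realm, and keytab path are placeholders, and the JDBC URL reuses the Kerberized form shown earlier:

    # Obtain a Kerberos ticket from the keytab, then confirm it exists
    kinit -kt /etc/security/keytabs/analyst.keytab analyst@EXAMPLE.COM
    klist

    # Any JDBC or ODBC client started in this session can now use a URL such as
    #   jdbc:hive2://hs2.example.com:10000/default;principal=hive/hs2.example.com@EXAMPLE.COM
    # without embedding a password in the connection string.

This keeps credentials out of the connection string entirely, which is usually preferable to the UID/PWD style shown in the ODBC examples.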
If the test is successful, click OK to close the Database Connection window; this property is what enables the Test Connection feature available in various database tools. Next, check whether you can reach the servers via your C# code using the appropriate connection strings; a quick Beeline run (sketched below) is also a handy sanity check. A few closing notes: Spark SQL is developed as part of Apache Spark, so the same connection strings apply to its Thrift server; I don't think you can do all of this purely with a Hive script; and once all the Hadoop daemons are working fine, just start the Hive installation part.
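As a quick connectivity check, with the host, credentials, and database below being placeholders (Beeline ships with Hive, so no extra driver setup is needed):

    beeline -u "jdbc:hive2://hive-server.example.com:10000/default" \
            -n hiveuser -p hivepassword \
            -e "SHOW DATABASES;"

If this prints the database list, the connection string, port, and credentials are good, and any remaining problems are on the client-tool side.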