impcur = connect(host="kudu3", port=21050, database="yingda_test", password=None, user='admin', use_http_transport=True).cursor()

I’m still getting the error:
Traceback (most recent call last):
  File "/Users/edy/src/PythonProjects/dt-center-algorithm/test/", line 4, in <module>
    impcur = connect(host="kudu3", port=21050, database="yingda_test", password=None, user='admin', use_http_transport=True).cursor()
  File "/usr/local/conda3/envs/py37/lib/python3.7/site-packages/impala/", line 129, in cursor
    session = self.service.open_session(user, configuration)
  File "/usr/local/conda3/envs/py37/lib/python3.7/site-packages/impala/", line 1187, in open_session
    return self._rpc('OpenSession', req, True)
  File "/usr/local/conda3/envs/py37/lib/python3.7/site-packages/impala/", line 1080, in _rpc
    response = self._execute(func_name, request, retry_on_http_error)
  File "/usr/local/conda3/envs/py37/lib/python3.7/site-packages/impala/", line 1142, in _execute
    .format(self.retries))
impala.error.HiveServer2Error: 3 attempts failed
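For comparison, here is the same impyla call with the connection settings collected in one place. The `auth_mechanism` value is an assumption (it must match the server's configured authentication), and a mismatch there is a common culprit when every RPC attempt fails:

```python
# Settings copied from the failing call above. auth_mechanism is an
# assumption: it has to match the server (e.g. 'NOSASL', 'PLAIN',
# 'GSSAPI', 'LDAP'), and a mismatch can surface as "N attempts failed".
conn_kwargs = dict(
    host="kudu3",
    port=21050,
    database="yingda_test",
    user="admin",
    password=None,
    use_http_transport=True,   # requires the server to run in HTTP mode
    auth_mechanism="PLAIN",    # assumption: adjust to the server's setup
)

def open_cursor(kwargs):
    # Deferred import so the settings can be inspected without impyla installed.
    from impala.dbapi import connect
    return connect(**kwargs).cursor()
```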

Your help is needed!
I read all the documentation I could find online (Stack Overflow, GitHub, etc.), but nothing helped.
I'm trying to connect to Hive (Hue) with Python from my PC, and my script fails with:

Traceback (most recent call last):
  File "C:/Users/myuser/Documents/Python/", line f, in <module>
    hive.connect('myconnect', port=10000, username='root').cursor()
  File "C:\Users\myuser\AppData\Local\Continuum\anaconda3\lib\site-packages\pyhive\", line 94, in connect
    return Connection(*args, **kwargs)
  File "C:\Users\myuser\AppData\Local\Continuum\anaconda3\lib\site-packages\pyhive\", line 192, in __init__
  File "", line 79, in open
    message=("Failed to start SASL: %s" % self.sasl.getError()))
thrift.transport.TTransport.TTransportException: Failed to start SASL: b'Error in sasl_client_start (-4) SASL(-4): mechanism not available: callback not found: 2'
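This particular SASL failure usually means the C `sasl` extension cannot load its mechanism plugins on Windows. One hedged workaround, assuming the server permits it, is to bypass SASL entirely with `auth='NOSASL'`; the sketch below (the helper names are my own, not pyhive API) only assembles the arguments:

```python
def build_hive_kwargs(host, port=10000, username="root", auth="NOSASL"):
    """Collect pyhive connection arguments in one place.

    auth='NOSASL' is an assumption: it only works when the server is
    configured with hive.server2.authentication=NOSASL; otherwise keep
    the default and fix the client-side SASL installation instead.
    """
    return dict(host=host, port=port, username=username, auth=auth)

def open_hive_cursor(**kwargs):
    # Deferred import so this helper loads even where pyhive is absent.
    from pyhive import hive
    return hive.connect(**build_hive_kwargs(**kwargs)).cursor()
```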

Python: 3.7.4

Platform: Anaconda, Inc. on win32

We are having problems connecting to Hive from Anaconda. The cluster is a multi-node Ambari 2.7.4 cluster running Hive 3.1.0. The Python connection to Hive works fine from the RHEL server, but the same environment never connects via Anaconda on Windows 10 (Conda version 4.9.0). Please find the exact error below:

TTransportException: Failed to start SASL: b'Error in sasl_client_start (-4) SASL(-4): mechanism not available: callback not found: 2'

We have a virtual private network with a Windows server and a Hortonworks-based Hadoop distribution installed on several Red Hat data nodes.

A Python script on the Windows machine in our group tries to access the Hive tables on that Hadoop/Hive cluster.

Port 10000 is open and listening for requests.

On the Windows machine, an ODBC connection to the host works fine, but the Python script below fails with an error.

We have installed the prerequisite packages required for Hive connections, as described below.

Please find the code below:

from pyhive import hive
from TCLIService.ttypes import TOperationState
import thrift
import pandas as pd
import pyhs2

cursor = hive.connect(host='dnanoripaihos01.retailaip.local', auth='KERBEROS', port='10000', kerberos_service_name='hive').cursor()
cursor.execute('SELECT * FROM undo LIMIT 50')

When one of us runs this script, it outputs the following error:


Traceback (most recent call last):
  File "C:\Users\rekha.b.gaonkar\Desktop\", line 9, in <module>
    cursor = hive.connect(host='dnanoripaihos01.retailaip.local', auth='KERBEROS', port='10000', kerberos_service_name='hive').cursor()
  File "", line 94, in connect
    return Connection(*args, **kwargs)
  File "", line 192, in __init__
  File "", line 79, in open
    message=("Failed to start SASL: %s" % self.sasl.getError()))
thrift.transport.TTransport.TTransportException: Failed to start SASL: b'Error in sasl_client_start (-4) SASL(-4): mechanism not available: callback not found: 2'
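As an aside, the script above imports pandas but never uses it. Once the connection works, the query results could be loaded into a DataFrame with a small helper like this sketch (`rows_to_frame` is a hypothetical name, not part of pyhive):

```python
import pandas as pd

def rows_to_frame(description, rows):
    """Build a DataFrame from DB-API results.

    `description` is the cursor.description sequence, whose first field
    per entry is the column name; `rows` is what fetchall() returned.
    """
    columns = [col[0] for col in description]
    return pd.DataFrame(rows, columns=columns)

# Usage after cursor.execute(...):
#   df = rows_to_frame(cursor.description, cursor.fetchall())
```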