I am getting this AttributeError: 'NoneType' object has no attribute 'split' when I try to migrate a SQL Server database to a MySQL database in MySQL Workbench.
These are the log details:
Starting...
Connect to source DBMS...
- Connecting to source...
Connecting to Mssql@sa...
Opening ODBC connection to Driver=sa;DATABASE=;UID=sa;PWD=XXXX...
Connected to Mssql@ 11.0.2100.60
Traceback (most recent call last):
File "/Applications/MySQLWorkbench.app/Contents/PlugIns/db_mssql_grt.py", line 147, in connect
_connections[connection.__id__]["version"] = getServerVersion(connection)
File "/Applications/MySQLWorkbench.app/Contents/PlugIns/db_mssql_grt.py", line 174, in getServerVersion
ver_parts = [ int(part) for part in ver_string.split('.') ] + 4*[ 0 ]
AttributeError: 'NoneType' object has no attribute 'split'
Traceback (most recent call last):
File "/Applications/MySQLWorkbench.app/Contents/PlugIns/db_mssql_grt.py", line 174, in getServerVersion
ver_parts = [ int(part) for part in ver_string.split('.') ] + 4*[ 0 ]
AttributeError: 'NoneType' object has no attribute 'split'
Traceback (most recent call last):
File "/Applications/MySQLWorkbench.app/Contents/Resources/libraries/workbench/wizard_progress_page_widget.py", line 65, in run
self.func()
File "/Applications/MySQLWorkbench.app/Contents/PlugIns/migration_source_selection.py", line 406, in task_connect
raise e
SystemError: AttributeError("'NoneType' object has no attribute 'split'"): error calling Python module function DbMssqlRE.getServerVersion
*** ERROR: Error during Connect to source DBMS: AttributeError("'NoneType' object has no attribute 'split'"): error calling Python module function DbMssqlRE.getServerVersion
Traceback (most recent call last):
File "/Applications/MySQLWorkbench.app/Contents/Resources/libraries/workbench/wizard_progress_page_widget.py", line 543, in update_status
task.run()
File "/Applications/MySQLWorkbench.app/Contents/Resources/libraries/workbench/wizard_progress_page_widget.py", line 80, in run
raise e
SystemError: AttributeError("'NoneType' object has no attribute 'split'"): error calling Python module function DbMssqlRE.getServerVersion
*** ERROR: Exception in task 'Connect to source DBMS': SystemError('AttributeError("\'NoneType\' object has no attribute \'split\'"): error calling Python module function DbMssqlRE.getServerVersion',)
Failed
I tried a solution (given below, taken from https://bugs.mysql.com/bug.php?id=66030), but it didn't help; I still get the same error. Please help me.
Solution:
We'll need some help from you to diagnose this one. With a text editor, open the /Applications/MySQLWorkbench.app/Contents/PlugIns/db_mssql_grt.py file, and around line 174 you'll find a line that looks like:
ver_string = execute_query(connection, "SELECT SERVERPROPERTY('ProductVersion')").fetchone()[0]
Change that to:
ver_string = execute_query(connection, "SELECT CAST(SERVERPROPERTY('ProductVersion') AS VARCHAR)").fetchone()[0]
Then save and retry. Thanks!
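If the CAST change alone doesn't help, a more defensive variant of that block (my own sketch, not an official patch) guards against the query returning NULL so that getServerVersion falls back to a default instead of crashing. MySQL Workbench loads its Python plugins at startup, so restarting Workbench after editing the file may also be needed for the change to take effect:
# Sketch of a defensive replacement for the lines around 174 in db_mssql_grt.py (not an official patch)
row = execute_query(connection, "SELECT CAST(SERVERPROPERTY('ProductVersion') AS VARCHAR)").fetchone()
ver_string = row[0] if row else None
if not ver_string:
    ver_string = '10.0'  # arbitrary fallback so the wizard can proceed (assumption, not a real version check)
ver_parts = [ int(part) for part in ver_string.split('.') ] + 4*[ 0 ]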
Traceback (most recent call last):
File "C:\Users\josej\AppData\Local\Programs\Python\Python310\lib\site-packages\mysql\connector\abstracts.py", line 553, in config
DEFAULT_CONFIGURATION[key]
KeyError: 'datebase'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "C:\Users\josej\proyectos\holamundo\curso\db.py", line 3, in
midb = mysql.connector.connect ( host="localhost", user="josejan21", password="123JOSE123jan#gmail", datebase="prueba")
File "C:\Users\josej\AppData\Local\Programs\Python\Python310\lib\site-packages\mysql\connector_init_.py", line 272, in connect
return CMySQLConnection(*args, **kwargs)
File "C:\Users\josej\AppData\Local\Programs\Python\Python310\lib\site-packages\mysql\connector\connection_cext.py", line 94, in init
self.connect(**kwargs)
File "C:\Users\josej\AppData\Local\Programs\Python\Python310\lib\site-packages\mysql\connector\abstracts.py", line 1049, in connect
self.config(**kwargs)
File "C:\Users\josej\AppData\Local\Programs\Python\Python310\lib\site-packages\mysql\connector\abstracts.py", line 555, in config
raise AttributeError("Unsupported argument '{0}'".format(key))
AttributeError: Unsupported argument 'datebase'
There's a typo in your code: in the mysql.connector.connect call you are passing "datebase" instead of "database" as an argument.
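For reference, here is the same call with just that keyword corrected:
import mysql.connector

# "database", not "datebase"
midb = mysql.connector.connect(
    host="localhost",
    user="josejan21",
    password="123JOSE123jan#gmail",
    database="prueba",
)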
I am having problems using a MySQL database as a metadata database in a TensorFlow Extended pipeline.
I used the penguin template to set up a very simple pipeline and also set up a MySQL database locally.
The only thing I changed in the code was using
tfx.orchestration.metadata.mysql_metadata_connection_config as the metadata_connection_config input to the pipeline instead of tfx.orchestration.metadata.sqlite_metadata_connection_config:
from tfx import v1 as tfx
# [...] Add a simple CsvExampleGen
tfx.dsl.Pipeline(
    pipeline_name=pipeline_name,
    pipeline_root=pipeline_root,
    components=components,
    metadata_connection_config=tfx.orchestration.metadata.mysql_metadata_connection_config(
        host="localhost",
        port=3306,
        database="ml_metadata",
        username="root",
        password="password"),
    beam_pipeline_args=beam_pipeline_args,
)
Running this code results in the following error message:
[2021-11-17 12:02:16,948] {taskinstance.py:1270} INFO - Marking task as FAILED. dag_id=penguin_pipeline, task_id=CsvExampleGen, execution_date=20211117T110208, start_date=20211117T110212, end_date=20211117T110216
[2021-11-17 12:02:16,966] {standard_task_runner.py:88} ERROR - Failed to execute job 101 for task CsvExampleGen
Traceback (most recent call last):
File "/home/user/anaconda3/lib/python3.8/site-packages/airflow/task/task_runner/standard_task_runner.py", line 85, in _start_by_fork
args.func(args, dag=self.dag)
File "/home/user/anaconda3/lib/python3.8/site-packages/airflow/cli/cli_parser.py", line 48, in command
return func(*args, **kwargs)
File "/home/user/anaconda3/lib/python3.8/site-packages/airflow/utils/cli.py", line 92, in wrapper
return f(*args, **kwargs)
File "/home/user/anaconda3/lib/python3.8/site-packages/airflow/cli/commands/task_command.py", line 292, in task_run
_run_task_by_selected_method(args, dag, ti)
File "/home/user/anaconda3/lib/python3.8/site-packages/airflow/cli/commands/task_command.py", line 107, in _run_task_by_selected_method
_run_raw_task(args, ti)
File "/home/user/anaconda3/lib/python3.8/site-packages/airflow/cli/commands/task_command.py", line 180, in _run_raw_task
ti._run_raw_task(
File "/home/user/anaconda3/lib/python3.8/site-packages/airflow/utils/session.py", line 70, in wrapper
return func(*args, session=session, **kwargs)
File "/home/user/anaconda3/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 1332, in _run_raw_task
self._execute_task_with_callbacks(context)
File "/home/user/anaconda3/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 1458, in _execute_task_with_callbacks
result = self._execute_task(context, self.task)
File "/home/user/anaconda3/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 1514, in _execute_task
result = execute_callable(context=context)
File "/home/user/anaconda3/lib/python3.8/site-packages/airflow/operators/python.py", line 151, in execute
return_value = self.execute_callable()
File "/home/user/anaconda3/lib/python3.8/site-packages/airflow/operators/python.py", line 162, in execute_callable
return self.python_callable(*self.op_args, **self.op_kwargs)
File "/home/user/anaconda3/lib/python3.8/site-packages/tfx/orchestration/airflow/airflow_component.py", line 76, in _airflow_component_launcher
launcher.launch()
File "/home/user/anaconda3/lib/python3.8/site-packages/tfx/orchestration/launcher/base_component_launcher.py", line 191, in launch
execution_decision = self._run_driver(self._input_dict, self._output_dict,
File "/home/user/anaconda3/lib/python3.8/site-packages/tfx/orchestration/launcher/base_component_launcher.py", line 152, in _run_driver
with self._metadata_connection as m:
File "/home/user/anaconda3/lib/python3.8/site-packages/tfx/orchestration/metadata.py", line 152, in __enter__
raise RuntimeError(
RuntimeError: Failed to establish connection to Metadata storage with error: mysql_real_connect failed: errno: , error:
In this log I was using Airflow, but the same exception message is shown if I run a LocalDagRunner:
Exception has occurred: RuntimeError
Failed to establish connection to Metadata storage with error: mysql_real_connect failed: errno: , error:
I have tried changing the host to "127.0.0.1", but this didn't change anything. Has anyone had a similar issue, or does anyone see an apparent error in my approach?
The MySQL server is version 8.0 and I am using TensorFlow Extended version 1.4.0.
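One sanity check that might help narrow this down (my own sketch, not part of the penguin template): open a connection with the exact same parameters from the same machine using mysql-connector-python. If this also fails, the problem is on the MySQL side (bind-address, user privileges, authentication plugin) rather than in the TFX configuration:
import mysql.connector  # assumes mysql-connector-python is installed

# Same parameters as passed to mysql_metadata_connection_config above
conn = mysql.connector.connect(
    host="localhost",
    port=3306,
    user="root",
    password="password",
    database="ml_metadata",
)
print(conn.is_connected())  # True if the credentials and the ml_metadata database are reachable
conn.close()
If this check succeeds, the failure is specific to the MySQL C client that ML Metadata uses (mysql_real_connect in the error message is a MySQL C API call), which narrows where to look.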
I followed this Vertex AI tutorial. However, at the last step, as the Cloud Function calls the prediction endpoint, it gets this failure.
This means it could not even access the metadata server, i.e., it is not a permissions failure (though I did check that the myproject@appspot.gserviceaccount.com service account does have the Project Editor role, as specified). It is also an error strictly in Cloud Functions and IAM, not in Vertex AI or other ML systems.
What is going wrong here?
AuthMetadataPluginCallback "<google.auth.transport.grpc.AuthMetadataPlugin object at 0x3e0961671dd0>" raised exception!
Traceback (most recent call last):
File "/env/local/lib/python3.7/site-packages/google/auth/compute_engine/credentials.py", line 113, in refresh
request, service_account=self._service_account_email, scopes=scopes
File "/env/local/lib/python3.7/site-packages/google/auth/compute_engine/_metadata.py", line 263, in get_service_account_token
token_json = get(request, path, params=params)
File "/env/local/lib/python3.7/site-packages/google/auth/compute_engine/_metadata.py", line 187, in get
response,
google.auth.exceptions.TransportError: ("Failed to retrieve http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/myproject@appspot.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform from the Google Compute Enginemetadata service. Status: 500 Response:\nb'Could not fetch URI /computeMetadata/v1/instance/service-accounts/myproject@appspot.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform\\n'", <google.auth.transport.requests._Response object at 0x3e095a9f4c50>)
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/env/local/lib/python3.7/site-packages/grpc/_plugin_wrapping.py", line 78, in __call__
context, _AuthMetadataPluginCallback(callback_state, callback))
File "/env/local/lib/python3.7/site-packages/google/auth/transport/grpc.py", line 101, in __call__
callback(self._get_authorization_headers(context), None)
File "/env/local/lib/python3.7/site-packages/google/auth/transport/grpc.py", line 88, in _get_authorization_headers
self._request, context.method_name, context.service_url, headers
File "/env/local/lib/python3.7/site-packages/google/auth/credentials.py", line 133, in before_request
self.refresh(request)
File "/env/local/lib/python3.7/site-packages/google/auth/compute_engine/credentials.py", line 117, in refresh
six.raise_from(new_exc, caught_exc)
File "<string>", line 3, in raise_from
google.auth.exceptions.RefreshError: ("Failed to retrieve http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/myproject@appspot.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform from the Google Compute Enginemetadata service. Status: 500 Response:\nb'Could not fetch URI /computeMetadata/v1/instance/service-accounts/myproject@appspot.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform\\n'", <google.auth.transport.requests._Response object at 0x3e095a9f4c50>)
Prediction request failed: <class 'google.api_core.exceptions.ServiceUnavailable'>: 503 Getting metadata from plugin failed with error: ("Failed to retrieve http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/myproject@appspot.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform from the Google Compute Enginemetadata service. Status: 500 Response:\nb'Could not fetch URI /computeMetadata/v1/instance/service-accounts/myproject@appspot.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform\\n'", <google.auth.transport.requests._Response object at 0x3e095a9f4c50>)
Function execution took 673 ms, finished with status code: 500
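One way to narrow this down (a diagnostic sketch, not a fix) is to reproduce the token fetch inside the function using google-auth alone, independent of the Vertex AI client, to confirm whether the metadata server itself is returning the 500:
import google.auth
from google.auth.transport.requests import Request

# Resolve the credentials the function actually runs with and force a token refresh.
# On Cloud Functions this goes through the same metadata-server endpoint that fails above.
credentials, project = google.auth.default(
    scopes=["https://www.googleapis.com/auth/cloud-platform"]
)
credentials.refresh(Request())
print(project, bool(credentials.token))
If this minimal refresh also fails with a 500, the problem sits between the function's runtime service account and the metadata server rather than in the prediction call itself.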
C:\Users\abhi1702\Desktop\selenium\venv\Scripts\python.exe
C:/Users/abhi1702/Desktop/selenium/app.py
Traceback (most recent call last):
File "C:\Users\abhi1702\Desktop\selenium\venv\lib\site-packages\selenium\webdriver\common\service.py", line 72, in start
self.process = subprocess.Popen(cmd, env=self.env,
File "C:\Users\abhi1702\AppData\Local\Programs\Python\Python38-32\lib\subprocess.py", line 854, in __init__
self._execute_child(args, executable, preexec_fn, close_fds,
File "C:\Users\abhi1702\AppData\Local\Programs\Python\Python38-32\lib\subprocess.py", line 1307, in _execute_child
hp, ht, pid, tid = _winapi.CreateProcess(executable, args,
FileNotFoundError: [WinError 2] The system cannot find the file specified
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "C:/Users/abhi1702/Desktop/selenium/app.py", line 5, in <module>
chrome= webdriver.Chrome(executable_path='Users\abhi1702\Desktop\chromedriver\chromedriver.exe')
File "C:\Users\abhi1702\Desktop\selenium\venv\lib\site-packages\selenium\webdriver\chrome\webdriver.py", line 73, in __init__
self.service.start()
File "C:\Users\abhi1702\Desktop\selenium\venv\lib\site-packages\selenium\webdriver\common\service.py", line 81, in start
raise WebDriverException(
selenium.common.exceptions.WebDriverException: Message: 'chromedriver.exe' executable needs to be in PATH. Please see https://sites.google.com/a/chromium.org/chromedriver/home
The answer is probably in your post:
Message: 'chromedriver.exe' executable needs to be in PATH
So please try adding the folder that contains chromedriver.exe to your PATH system variable.
On Windows, open System Properties > Environment Variables, select the "Path" system variable, click Edit, and add a new entry pointing at the folder that holds chromedriver.exe, for example C:\drivers.
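Alternatively, you can point Selenium directly at the driver instead of relying on PATH. Note that your original call used a relative path without the drive letter, which is one reason the file could not be found. A sketch, assuming the driver really lives at C:\Users\abhi1702\Desktop\chromedriver\chromedriver.exe:
from selenium import webdriver

# Selenium 3.x style: pass an absolute path as a raw string so backslashes are not treated as escapes
chrome = webdriver.Chrome(executable_path=r"C:\Users\abhi1702\Desktop\chromedriver\chromedriver.exe")

# Selenium 4.x style, if you upgrade: wrap the path in a Service object instead
# from selenium.webdriver.chrome.service import Service
# chrome = webdriver.Chrome(service=Service(r"C:\Users\abhi1702\Desktop\chromedriver\chromedriver.exe"))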
I'm trying to use the rabbitmq_parameter ansible module to set a federation upstream set, while dynamically generating the set, with something like this:
- name: Set federation upstream set
  rabbitmq_parameter:
    component: federation-upstream-set
    name: my-upstreams
    vhost: my-vhost
    value: "{{ my_upstream_set }}"
The variable my_upstream_set is defined in a separate host variable file, like so:
my_upstream_set: [{"upstream": "upstream1"}, {"upstream": "upstream2"}]
However, no matter how I generate the value argument, which must be JSON (with or without quotes, with single or double quotes, YAML- or JSON-formatted), I can't get this to work. I either get the task failing with "stderr: Error: JSON decoding error", or the following error:
failed: [myhost] => {"failed": true, "parsed": false}
invalid output was: Traceback (most recent call last):
File "<stdin>", line 1498, in <module>
File "<stdin>", line 142, in main
File "<stdin>", line 104, in set
File "<stdin>", line 88, in _exec
File "<stdin>", line 1351, in run_command
File "/usr/lib/python2.7/posixpath.py", line 261, in expanduser
if not path.startswith('~'):
AttributeError: 'list' object has no attribute 'startswith'
debug3: mux_client_read_packet: read header failed: Broken pipe
debug2: Received exit status from master 1
I've tried running the task with a hardcoded value (so, directly in the task file) and it works as expected, but I have no way of integrating variables into that. Any idea what I might be doing wrong here? Thanks!
my_upstream_set should be a JSON string; in your case it is a list.
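In the playbook itself, one way to get a JSON string (my suggestion, not something stated in the original answer) is the Jinja2 to_json filter, e.g. value: "{{ my_upstream_set | to_json }}". For illustration, here is the list versus its JSON string form in Python:
import json

# The host-var value as Ansible sees it: a list of dicts
my_upstream_set = [{"upstream": "upstream1"}, {"upstream": "upstream2"}]

# What the rabbitmq_parameter value argument needs: a single JSON string
value = json.dumps(my_upstream_set)
print(type(my_upstream_set).__name__)  # list
print(value)                           # [{"upstream": "upstream1"}, {"upstream": "upstream2"}]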