Export large database from remote
# structure-only dump of the big Magento log tables; note that naming tables after
# dbname limits the dump to just those tables -- dump the data for the rest with
# --ignore-table=dbname.report_event --ignore-table=dbname.report_viewed_product_index
mysqldump -p -u dbuser -h dbhost dbname \
  --single-transaction --compress --quick --max_allowed_packet=512M \
  --no-data report_event report_viewed_product_index \
  | gzip -9 > dbname.sql.gz
If this doesn't work, use this:
https://gitlab.com/chung1905/pysqldump/
* Exports table by table -> a network disconnect only loses one table
* Exports multiple tables in parallel -> faster
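The same table-by-table, parallel idea can be sketched with plain mysqldump and xargs (dbhost/dbuser/dbname are the same placeholders as the command above):

```shell
# Sketch: one gzipped dump per table, 4 dumps running in parallel, so a
# dropped connection only costs the table that was in flight
mkdir -p output/dbname
mysql -N -p -u dbuser -h dbhost -e 'SHOW TABLES' dbname |
  xargs -P 4 -I {} sh -c \
    'mysqldump -p -u dbuser -h dbhost --single-transaction --quick dbname "$1" \
       | gzip > "output/dbname/$1.sql.gz"' _ {}
```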
Note:
* Maybe optimize the import by disabling autocommit, foreign_key_checks, unique_checks, ... https://dba.stackexchange.com/questions/98814/mysql-dump-import-incredibly-slow-on-my-developers-machine
* Skip triggers and some log tables: https://stackoverflow.com/questions/5109993/mysqldump-data-only
* No need to concat all the table files into one; just use something like: zcat output/dbname/*.sql.gz | mysql -p -u dbuser dbname
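The import-speedup note above can be sketched by wrapping the concatenated dumps in one mysql session (assumes the output/dbname/*.sql.gz layout and the same dbuser/dbname placeholders):

```shell
# Sketch: turn off autocommit and the per-row checks for the duration of the
# import, then restore them and commit once at the end
( echo 'SET autocommit=0; SET foreign_key_checks=0; SET unique_checks=0;'
  zcat output/dbname/*.sql.gz
  echo 'SET unique_checks=1; SET foreign_key_checks=1; COMMIT;'
) | mysql -p -u dbuser dbname
```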
Binary-mode bug when importing
https://stackoverflow.com/questions/17158367/enable-binary-mode-while-restoring-a-database-from-an-sql-dump/23569985#23569985
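Per the linked answer, the usual causes are feeding mysql the still-gzipped file or binary blobs in the dump; a sketch of the fix (same dbuser/dbname placeholders as above):

```shell
# Decompress first, and pass --binary-mode so NUL bytes in blob columns
# don't abort the restore
zcat dbname.sql.gz | mysql --binary-mode -p -u dbuser dbname
```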
MySQL error 1449: The user specified as a definer does not exist
Create the missing user:
CREATE USER 'magento'@'%' IDENTIFIED BY 'password';
GRANT ALL PRIVILEGES ON *.* TO 'magento'@'%';
FLUSH PRIVILEGES;
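An alternative to creating the user is stripping the DEFINER clauses from the dump before importing; a sketch (the sed pattern assumes backtick-quoted definers like DEFINER=`magento`@`%`, check it against your dump):

```shell
# Remove DEFINER=`user`@`host` clauses so views/routines are created with
# the importing user as definer
zcat dbname.sql.gz | sed -E 's/DEFINER=`[^`]+`@`[^`]+`//g' | mysql -p -u dbuser dbname
```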
Communications link failure
The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.
javax.net.ssl.SSLHandshakeException: No appropriate protocol (protocol is disabled or cipher suites are inappropriate).
fix:
set enabledTLSProtocols to TLSv1,TLSv1.1,TLSv1.2,TLSv1.3 in the JDBC connection properties
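In Connector/J the property can go straight into the JDBC URL; a hypothetical example (host, port, and database are placeholders):

```shell
# enabledTLSProtocols re-enables the older TLS versions that newer JDKs disable by default
JDBC_URL='jdbc:mysql://dbhost:3306/dbname?enabledTLSProtocols=TLSv1,TLSv1.1,TLSv1.2,TLSv1.3'
echo "$JDBC_URL"
```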