by kintu micheal - 3 years ago (2017-10-31)
Being able to sync a local content database with an online hosted database.
1. by Maik Greubel - 3 years ago (2017-11-01)
Syncing data from one database host to another can be a problem when foreign key constraints are in play.
You need a script that first checks whether a given primary key already exists in the destination database. If the PK already exists, you need to generate a new one and update the references of all foreign keys where that PK is used.
But I like this request and want to know whether I'm able to fulfil it.
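The check-then-remap step could be sketched like this (a minimal Python/SQLite sketch, assuming a single integer key column; the helper name and the max+1 key strategy are my own choices for illustration, not an existing class):

```python
import sqlite3

def insert_with_pk_check(dest, table, row, pk_col="id"):
    """Insert a row into the destination DB; if its primary key already
    exists there, assign a fresh key instead (hypothetical helper)."""
    cur = dest.execute(f"SELECT 1 FROM {table} WHERE {pk_col} = ?", (row[pk_col],))
    if cur.fetchone() is not None:
        # PK collision: generate a new key (here simply max + 1)
        new_pk = dest.execute(f"SELECT MAX({pk_col}) + 1 FROM {table}").fetchone()[0]
        row = {**row, pk_col: new_pk}
    cols = ", ".join(row)
    marks = ", ".join("?" for _ in row)
    dest.execute(f"INSERT INTO {table} ({cols}) VALUES ({marks})", tuple(row.values()))
    return row[pk_col]
```

The caller gets back the key that was actually used, so it can then fix up any foreign key references that pointed at the old value.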
2. by kintu micheal - 3 years ago (2017-11-03) in reply to comment 1 by Maik Greubel
@Maik Greubel, thanks for your reply. It's a good idea, so please explain more, and if possible recommend a class showing how I am going to achieve this.
3. by Maik Greubel - 3 years ago (2017-11-05) in reply to comment 2 by kintu micheal
Sorry, I have no class yet, but I want to describe the problem to you.
Let's assume that you have a table 'customer' with an identical structure in both databases, your production and your development one, but the datasets are different. Let's assume you have a customer "John Doe" who uses primary key 123 as ID in your development database, while in your production database the ID 123 is also taken, but by a customer "George Miller". Now you have a constraint violation, because a primary key must be unique, and synchronization from one database to the other is not possible: the database engine will throw an error if you try to insert an existing unique key.
Updating the production database is also not possible, because you would break existing and/or closed orders, and this is forbidden in most countries by law.
So you could only update the development database - which is probably also not wanted - because you may change the behaviour of existing automated tests.
The only way to perform a two-way synchronization is to insert the non-existing rows from one database into the other, which means you have to provide new primary keys for the existing data.
So let's assume you have the George Miller account, which has 3 connected orders. You need to change the references from the orders to the customer to keep the references and the test data consistent.
So you can see that synchronizing existing data from one database to another is no trivial job when records in both databases have been modified.
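A remap of that kind could look roughly like this (a Python/SQLite sketch; the table and column names follow the example above, and the function itself is hypothetical):

```python
import sqlite3

def remap_customer_pk(db, old_pk, new_pk):
    """Give a customer a new primary key and update every order that
    references the old one, in a single transaction (sketch)."""
    with db:  # commit on success, roll back on error
        db.execute("UPDATE customer SET id = ? WHERE id = ?", (new_pk, old_pk))
        db.execute("UPDATE orders SET customer_id = ? WHERE customer_id = ?",
                   (new_pk, old_pk))
```

Doing both updates in one transaction matters: if the order update fails halfway, you would otherwise be left with orders pointing at a customer key that no longer exists.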
4. by Dave Smith - 3 years ago (2017-11-06) in reply to comment 3 by Maik Greubel
Re-indexing is probably not the best approach, since you can run into a race condition.
It seems to me that index control has to remain with the production database. To accomplish this, any remote CRUD operations need to be made through an API call that updates production and returns a response for the remote script to process.
If, for example, I want to insert a new record into the local database...
1) I make a call to the API to insert the new record.
2) The production server inserts the new record.
3) It returns the success response and the new record's key value.
4) The local script inserts the new record using the master key value.
If I am updating a single record, I send the update request through the API which updates the production and returns all values for that record to sync the local record.
If the query updates multiple records, then I would just get a success response and run the update query on the local database, to avoid large API responses.
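The four insert steps above could be sketched like this (a Python/SQLite stand-in; `api_insert` plays the role of the production-side HTTP endpoint, and its request/response shape is my assumption, not an existing API):

```python
import sqlite3

def api_insert(production, table, row):
    """Production-side handler: insert and return the authoritative key.
    In practice this would sit behind an HTTP endpoint."""
    cols = ", ".join(row)
    marks = ", ".join("?" for _ in row)
    cur = production.execute(f"INSERT INTO {table} ({cols}) VALUES ({marks})",
                             tuple(row.values()))
    return {"status": "ok", "id": cur.lastrowid}

def local_insert(local, production, table, row):
    """Local-side client: ask production for the key, then mirror the row."""
    resp = api_insert(production, table, row)   # steps 1-3: the API call
    row_with_key = {"id": resp["id"], **row}    # step 4: use the master key
    cols = ", ".join(row_with_key)
    marks = ", ".join("?" for _ in row_with_key)
    local.execute(f"INSERT INTO {table} ({cols}) VALUES ({marks})",
                  tuple(row_with_key.values()))
    return resp["id"]
```

The point of the design is that only production ever generates keys, so the collision problem described earlier in the thread cannot arise.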
When the local environment is fired up, its database will be out of sync with the production. This can be handled with a data dump or record comparison.
The biggest problem is how to keep the local database synced when changes are made directly to the production. You would need a method to fetch and update the local database when missing keys are identified from an API insert request.
I don't think it would be possible to keep a local database 100% in sync with a production database; however, the API approach would keep control with production so that you don't really mess up production data.
This package may help: it implements a client to get MySQL replication events in pure PHP.
It establishes a socket connection to a MySQL binary log server so it can listen to MySQL database change events, like insert, update, and delete queries with their data, as well as raw SQL queries.
The events can be dumped for debugging purposes or used to replicate the database on a new server.
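For illustration, events consumed from such a stream could be replayed against a local copy along these lines (a Python/SQLite sketch; the event shape used here is my own assumption for the example, not the package's actual API):

```python
import sqlite3

def apply_event(local, event):
    """Apply one replication-style change event to the local copy.
    Event shape ({'type', 'table', 'row'}) is assumed for illustration."""
    kind, table, row = event["type"], event["table"], event["row"]
    if kind == "insert":
        cols = ", ".join(row)
        marks = ", ".join("?" for _ in row)
        local.execute(f"INSERT INTO {table} ({cols}) VALUES ({marks})",
                      tuple(row.values()))
    elif kind == "update":
        sets = ", ".join(f"{c} = ?" for c in row if c != "id")
        vals = [v for c, v in row.items() if c != "id"]
        local.execute(f"UPDATE {table} SET {sets} WHERE id = ?", (*vals, row["id"]))
    elif kind == "delete":
        local.execute(f"DELETE FROM {table} WHERE id = ?", (row["id"],))
```

Because the events carry the production keys, replaying them in order keeps the replica's primary keys identical to production's.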
1. by kintu micheal - 3 years ago (2017-11-06)
Could you recommend the way to implement this package in the system, where the online hosted database should be listening to events from a local database?
2. by Dave Smith - 3 years ago (2017-11-06) in reply to comment 1 by kintu micheal
The production (online) server can't access your local environment; it can only access remote servers in the network.
The database sync has to be initiated from your local environment to the production server. Alternatively, a batch file system needs to be in place to upload batched changes to the production server, if the data to be synced will be coming from more than one remote database.
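A batch upload of that kind could be sketched as a local change journal that gets flushed to production (a Python/SQLite sketch; the journal table, function names, and `upload` callback are made up for illustration):

```python
import json
import sqlite3

def queue_change(local, op, table, row):
    """Record a local change in a journal table for later batch upload."""
    local.execute("INSERT INTO change_journal (op, tbl, payload) VALUES (?, ?, ?)",
                  (op, table, json.dumps(row)))

def flush_batch(local, upload):
    """Send all queued changes to production (upload stands in for the
    real HTTP call) and clear the journal on success."""
    rows = local.execute(
        "SELECT id, op, tbl, payload FROM change_journal ORDER BY id").fetchall()
    batch = [{"op": op, "table": tbl, "row": json.loads(p)}
             for _, op, tbl, p in rows]
    if upload(batch):
        local.execute("DELETE FROM change_journal")
    return len(batch)
```

Keeping the journal until the upload succeeds means a failed connection just leaves the changes queued for the next flush.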
3. by kintu micheal - 3 years ago (2017-11-06) in reply to comment 2 by Dave Smith
Yeah, that's true, I've got the logic. So can you help explain to me how to implement MySQL replication with that concept, with the local database as the slave?
4. by Dave Smith - 3 years ago (2017-11-06) in reply to comment 3 by kintu micheal
Maik Greubel explained one of the problems to overcome, and as he said, it is not a trivial job.
I haven't used this replication package; however, it can only listen to databases in the network, and your local database is not in the network. So... I don't think it will do what you need. I will add a new comment outside of this package recommendation.