Hi all!
I'm developing a .NET application that connects to a MySQL database. I use the DECODE function to decode certain passwords stored in the DB.
I've tested the application against a DB on localhost and against DBs on two other servers.
The application works fine (decodes the data correctly) on localhost, BUT it does not work against the databases on the other hosts. The problem is that when I connect to the other hosts, my query (which contains the DECODE function) returns different bytes, so the decoded password is wrong.
At first I thought the different MySQL server versions (on localhost vs. the other hosts) were the problem, but now all machines run version 5.0 and I still haven't found a solution.
Also, when I run the application directly on the other servers (so the DB is localhost for them), I still can't decode the data correctly...
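For reference, my query is roughly of this shape (the table, column, and key names here are placeholders, not my real ones):

```sql
-- Sketch of the query in question: the encoded password bytes are
-- read back with DECODE() using the same key that ENCODE() used.
SELECT DECODE(pwd, 'my_secret_key')
FROM users
WHERE id = 1;

-- To compare what each host actually returns, I can look at the
-- raw stored bytes too:
SELECT HEX(pwd) FROM users WHERE id = 1;
```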
I've searched the internet but found nothing.
Can someone help?
Greetings