We recently decided to make a public release of an old web-based application coded in ASP (classic, using VBScript) and using Microsoft Access. In order to make this application public we need to make a few modifications, not the least of which is moving from Microsoft Access to MySQL. Using Microsoft Access on the back-end would significantly hamper the ability of the application to support concurrent users, among other issues.
The majority of the coding modifications have yet to be made, but the database switch has already occurred. In the process of moving from MS Access to MySQL I discovered a few settings that would be helpful should this action need to be performed for other applications. These settings should enable similar applications to be moved with minimal modification to the programming.
First, let’s review some settings related to the MySQL ODBC driver. These settings are relevant to all versions of the driver, but their names may differ between versions (I’m using 5.1.6). Here are the options which should be selected:
- Return matched rows instead of affected rows
- Treat BIGINT columns as INT columns
- Enable safe options
The following information relates more generally to changes that may have to be made in the code:
- MySQL doesn’t really support server-side cursors, so the ODBC driver fakes it. This is mostly fine, except that some properties of the Recordset object are not available (namely RecordCount). To get full cursor support you should change the cursor location from the server to the client (adUseClient, or the literal value 3).
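A minimal sketch of that change in classic ASP; the DSN name and query here are made up for illustration:

```vbscript
<%
' adUseClient = 3; defined here in case adovbs.inc is not included
Const adUseClient = 3

Set objConn = Server.CreateObject("ADODB.Connection")
objConn.Open "DSN=myapp"   ' hypothetical DSN using the MySQL ODBC driver

Set objRS = Server.CreateObject("ADODB.Recordset")
objRS.CursorLocation = adUseClient   ' client-side cursor: RecordCount now works
objRS.Open "SELECT id, name FROM users", objConn

Response.Write objRS.RecordCount
%>
```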
- ASP doesn’t understand unsigned integers. This causes problems when performing operations on these values unless you explicitly convert the value in your script, e.g. scriptvar = CInt(objrs("dbcol")). The other solution is to make all integer columns signed. Otherwise you will see the error: Variable uses an Automation type not supported in VBScript.
- Related to the above is the use of values from the database in comparisons. VBScript values carry a subtype (integer, string, boolean, etc.). Though you can’t specify the type when declaring a variable (with Dim), VBScript does pay attention to subtype when performing comparisons. When two values of different subtypes are compared you will get a “Type mismatch” error. The resolution is to convert your variables with the appropriate function (CInt, CStr, CLng, etc.) if you run into this type of error.
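For example (the column name here is hypothetical), forcing a known subtype before comparing avoids both of the errors above:

```vbscript
<%
' objRS("item_id") may come back with a subtype VBScript can't handle
' (e.g. from an unsigned column); CLng forces a signed Long first.
itemId = CLng(objRS("item_id"))

If itemId = 42 Then
    Response.Write "Found item 42"
End If
%>
```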
- Finally, check your SQL statements for any VBA function calls. These will either have to be modified into MySQL-compatible function calls (e.g. Access’s Nz() becomes IFNULL() and IIf() becomes IF()) or removed from the SQL code altogether.
There are a number of issues that may be encountered when attempting to convert an ASP-based application from MS Access to MySQL. The issues addressed here are only those relevant to this particular application. Other applications may require additional or different solutions and settings.
As of this writing, CakePHP can’t use CURRENT_TIMESTAMP as the default for a column. With MySQL, when a column’s default is set to CURRENT_TIMESTAMP, the current date and time is inserted for any new row that doesn’t explicitly supply a value for that column.
From what I can tell, CakePHP specifies the values for all columns when it creates a new record. A column with a defined default that is not explicitly set by the user is filled in by CakePHP itself (rather than letting MySQL apply the default on insertion). But CakePHP doesn’t understand the CURRENT_TIMESTAMP keyword, so it treats it as a string and wraps it in quotes. This breaks the resulting INSERT statement and you receive an error:
Incorrect datetime value: ‘CURRENT_TIMESTAMP’
Interestingly, columns named “created” and “modified” receive special handling from CakePHP: it treats them as auto-update columns and sets them as expected. With that special handling in mind, it is possible to work around the CURRENT_TIMESTAMP bug by following the recommended settings for created/modified columns, i.e. defining the field as DATETIME with a default of NULL. CakePHP will then automatically update the columns when inserting/updating records.
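A sketch of a table definition following that convention (the table name and the other columns are illustrative):

```sql
CREATE TABLE posts (
  id       INT AUTO_INCREMENT PRIMARY KEY,
  title    VARCHAR(255),
  created  DATETIME DEFAULT NULL,  -- CakePHP sets this on INSERT
  modified DATETIME DEFAULT NULL   -- CakePHP sets this on INSERT and UPDATE
);
```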
With a large set of test packets in need of transcription, little time to do so, and even less desire to spend the money on temporary workers, I needed to think of some other way to perform the task. Luckily I was able to come up with a method that proved relatively easy to implement, using a tool from Amazon: Mechanical Turk.
Continue reading The Mechanical Turk Knows All
I was trying to find a way to update some rows in a MySQL table where the condition for performing the update was based on a column in another table. Normally this isn’t an issue; just run UPDATE table ... WHERE key_column IN (SELECT key_column FROM ...).
I was having some problems, however, because the table that needed to be updated uses a multi-column primary key. Normally I would select the relevant rows using a JOIN. Unfortunately, you can’t update the results of a JOIN. How then to select the relevant rows?
Turns out it’s easier than I thought. You can still use the WHERE ... IN clause; you just specify all of the columns that satisfy the condition: UPDATE table ... WHERE (key_column1, key_column2) IN (SELECT key_column1, key_column2 FROM ...).
I did not see any information about this functionality in the MySQL Reference Manual (1, 2).
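The row-constructor syntax can be exercised end to end. Here’s a minimal sketch using Python’s sqlite3 module (recent SQLite versions accept the same (a, b) IN (subquery) form as MySQL; all table and column names below are invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Toy schema: "orders" has a two-column primary key; "flagged" lists
# the (customer_id, order_no) pairs whose status should change.
cur.executescript("""
    CREATE TABLE orders (customer_id INTEGER, order_no INTEGER,
                         status TEXT, PRIMARY KEY (customer_id, order_no));
    CREATE TABLE flagged (customer_id INTEGER, order_no INTEGER);
    INSERT INTO orders VALUES (1, 1, 'open'), (1, 2, 'open'), (2, 1, 'open');
    INSERT INTO flagged VALUES (1, 2), (2, 1);
""")

# Multi-column UPDATE ... WHERE (a, b) IN (subquery), as described above.
cur.execute("""
    UPDATE orders SET status = 'closed'
    WHERE (customer_id, order_no) IN
          (SELECT customer_id, order_no FROM flagged)
""")

rows = sorted(cur.execute("SELECT customer_id, order_no, status FROM orders"))
print(rows)  # [(1, 1, 'open'), (1, 2, 'closed'), (2, 1, 'closed')]
```

Only the rows whose key pair appears in the subquery are touched; the single-column rows stay as they were.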
After creating a new structure for the current year’s piloting data there has been a bit of disconnect in development between the two data sets. There’s just not enough time to ensure that everything works across both data sets. There were basically two choices to moving forward: 1) construct a view of the new data structure that mimics the old data structure; 2) port the old data structure to the new one and continue redevelopment of the scripts. I chose the latter, mainly because there are some improvements I’d like to make to the interface in the process.
I created a series of SQL statements to run in MySQL that will convert the data from the packet_item_records and miscon_packet_refs tables to the packet_students, packet_data, and miscon_packetdata_refs tables. After the conversion I updated any pages that referenced the old data structure. So far in testing the data seems to have converted perfectly.
Minus one issue. Multiple selections for the answer choice questions (A, B, C, D) from 2006 were recorded as a generic Multiple rather than Y+NS, N+NS, etc. This value is not represented in the updated data format, the data entry forms, or the summary tables. Rather than spend too much time addressing this issue, I’m going to leave these values empty for now. I don’t expect this to be a problem since the researchers are focused on data for the current pilot and field tests. Also, I’m keeping the old version of the piloting data, and the scripts that interact with it, online in case it’s needed.
While working on the Benchmarks search I wanted to provide a feature I find useful on Google and other search engines: word form expansion (lemmatisation). A little research showed me that this would require more work than we should really be spending on search functionality, especially considering that the built-in MySQL full-text search capability is sufficient for our needs. So I decided to focus on a feature that would still provide value but require little time: word stem expansion.
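The core of stem expansion can be quite small. As a sketch (the actual Benchmarks implementation isn’t shown in this excerpt), one approach with MySQL full-text search is to rewrite each term with BOOLEAN MODE’s trailing truncation operator so a stem also matches its longer forms:

```python
def expand_terms(query):
    """Append MySQL BOOLEAN MODE's trailing wildcard (*) to each search
    term, so e.g. "teach" also matches "teacher" and "teaching"."""
    return " ".join(
        term if term.endswith("*") else term + "*"
        for term in query.split()
    )

print(expand_terms("plate tectonics"))  # plate* tectonics*
```

The expanded string would then be passed to MATCH ... AGAINST (... IN BOOLEAN MODE).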
Since we’ll be exposing MySQL to significantly more traffic (due mainly to the transition to a database-driven version of Benchmarks Online [dbBOL]) I decided to spend some time optimizing the server’s settings. There are a number of settings that can be tweaked to improve performance. I based my decisions on the information available from the references cited and the performance statistics reported by MySQL (SHOW VARIABLES for the current settings and SHOW STATUS for the runtime statistics, or use phpMyAdmin). MySQL has been running for 131 days as of the writing of this post (see cached copy of the runtime stats), so I expect the data to be a fairly good indication of MySQL’s performance under its current usage. Unfortunately, I expect the usage pattern to change significantly once dbBOL is released, so some of the settings used will be based on expected usage patterns. At specific intervals after dbBOL is released we should examine MySQL’s performance via the runtime stats to determine whether additional tweaking needs to be performed. I recommend the following schedule: 1 week, 1 month, 3 months, then every 6 months.
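Tuning like this usually ends up as a handful of lines in my.cnf. The values below are placeholders, not the settings actually chosen here; derive yours from the SHOW STATUS counters noted in the comments:

```ini
[mysqld]
key_buffer_size  = 64M   # MyISAM index cache; watch Key_reads vs Key_read_requests
table_cache      = 256   # raise if Opened_tables climbs quickly
query_cache_size = 32M   # useful for read-heavy workloads like dbBOL
max_connections  = 100   # compare against Max_used_connections
```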
Jill is working on a new topic (Processes that shape the earth/Plate Tectonics Version II (PT)) which shares a number of misconceptions with another topic (Earth Science: Processes that shape the Earth / Plate Tectonics (EP)).
Problem: She changed the topic of those misconceptions believing it would be available to both topics. Once she realized this was not the case, she changed the topic back, but the numbering was off.
Solution: She provided me with a list of which misconceptions were out of sync. Because adding a misconception involves updating multiple tables, I had to add the misconceptions through the Item interface and then change the numbering on the backend.
Conclusion: The conclusion here is pretty obvious: this functionality needs to be revised. Changing the topic of a misconception has considerable consequences throughout the Items utility, so it may be necessary to add an additional step to this process. For example, display the current topic along with a checkbox and a disabled/hidden topic menu, then use the checkbox to enable/disable the menu, possibly with some kind of textual warning (“Changing this is likely to break something.”).
Another note: the ‘code’ column denoting the numbering sequence is not a unique column, so we could make the numbering editable in the interface, although that could lead to duplication complications.
I also found it strange that when submitting a new misconception there is no notification and you are not redirected back to the miscon list. I found out the hard way that once you submit a misconception you are essentially in edit mode (no longer Add New Miscon) and any further changes are applied to the record you just added. The difference between Add and Edit mode was not obvious to me.
As I was meddling with registrant_list_demo.php I found that the demographics were not showing up for any of my test cases. I was under the mistaken impression this data was obtained from an external server, but it is actually taken from an NCES data dump stored on our server in a database called ‘demographics’ (go figure). The demographics data appears to be sorted by zip code. I surmise that this data dump is requested from NCES after the registration process is complete and an updated list of zip codes has been generated.
I’m sure Mr. Sweeney can provide details of this request procedure upon his return. The demographics script is working fine, but the demographics data is most likely out of date. Cari indicated the demographics data will not be needed for a few weeks.
When I first opened up the register dir I found register.php, review.php, and review.v1.php. Based on notes in the scripts, the registration and review-registration processes were consolidated by Mr. Sweeney in register.php, though perhaps at one point they were separate. review.php and review.v1.php turned out to be the registrant listing scripts.
These names were confusing, so I renamed them to registrant_list.php and registrant_list_demo.php (demographics).
I added a filter (by year) to limit output results. However, it is not technically filtering by year. There was already a date limit written into the existing query (2007-04-27, I think), which I used as the start-range marker for 2007. The start range for 2008 is 2008-01-01. The messy part is that everything prior to 2007-04-27 is marked as 2006 in the filter, when in fact the records only go back to 2007-04-17. I expect to be working on this script again in the near future (see “Updates to field test registrant_list_demo”), at which point I will probably change the filter to just ‘Prior’.