Tuesday, December 16, 2008

My Favorite Firefox Add-ons

Using a browser like Google Chrome reminds me how addicted I am to my Firefox add-ons. I use 15 or so, but these are by far my favorites:

Delicious Bookmarks

Can't go without having the same bookmarks on every PC, laptop and phone I use. The search and sidebar make this a must-have.

IE Tab

Love switching to IE inside Firefox when a web page won't display correctly. Even better are the filters I can set to have it switch automatically once I know a site doesn't render well in Firefox.

TinyUrl Creator

Right click on a link and automatically have a tinyurl created and on the clipboard ready for pasting.

DownThemAll!

Cool utility that lets you download all images from a web page in a batch. Good for backing up a web site you don't have file access to (like a Blogger blog), or for getting every little image that makes up some web menu or other construct you'd like to emulate.

Adblock

The name pretty much says it all. Turns sites that bombard me with animated ads into static images my eyes and brain can comprehend.

Flashblock

I would imagine everyone has this already, but I'll mention it just in case. Great again for blocking sites with way too many animations.

Friday, December 12, 2008

2008 In Review

I'm only 11 months in, but I'm optimistic and declaring my first year of blogging a success. Not only am I still interested in blogging, I managed to post more information than I initially planned. My initial goal was 2 posts per month. So far in 2008 I managed 41 posts, although some of them were purely announcements and probably shouldn't count as content.   :)

Considering my other commitments, I am very pleased with 41 posts. These are my favorites from 2008:

Why Blogs are not as Lame as they Sound

A Database Trigger That Maintains an RSS Feed

Nitpicks Matter

Direct Table Access in ADO.NET

Updating Static Cursors

Delphi: Did You Know #1

Wednesday, December 3, 2008

ARC: Did You Know #5

Did you know you can double-click in the filter edit box of the table browser and get a larger filter dialog?



In addition to a larger buffer to enter text, the dialog includes options to save, print and load filter expressions.



Tuesday, December 2, 2008

Twitter Test

I've avoided Facebook and LinkedIn and still have no plans to join either. I have been avoiding Twitter as I don't quite understand how it provides more than some trivial value, but on a whim I have decided to give it a try.

OK, not entirely on a whim... Out of curiosity I searched Twitter for "Advantage Database", expecting zero results. I found a handful of posts including one by Marco Cantu, and I wanted to reply. I had to have an account to reply. And now I have an account.  :)

Now that I have passed the entry barrier I figured I would give it a try. If you feel like following me, my Twitter account name is jeremym1234. Tell your friends.   :)

Wednesday, November 19, 2008

The Need for Feeds - Part 2

Not willing, or don't think you have the time, to subscribe to all of the blogs I listed in my Need for Feeds post? No problem: Google Reader has cool sharing functionality. When I read a post that I think is especially helpful or interesting, I will mark it as shared. All you have to do is subscribe to my shared feed, and you can see the items that I mark as shared.

Consider it the "J.D. filter". I read all of my blogs, weed out the noise, and you get a single feed with just the good stuff I find.

My shared items web page:  http://www.google.com/reader/shared/15644120060250304199

The RSS feed is linked from that page as well.

Tuesday, November 18, 2008

"Scripting" Languages and Utilities

OK, calling them "scripting" languages these days is a disservice. What I would traditionally call a scripting language is a language that doesn't require a compiled executable, is dynamically typed, has great string manipulation functions or classes, and has regular expression support. These days there are tons of choices, and most of them are full-fledged languages with functional frameworks that can accomplish anything a compiled language can. This post is a summary of a recent experience with some "scripting" languages, and the cool tools I found to make the experience much more enjoyable.

 

I recently worked on two tasks that together put me in front of Python, Perl, and Ruby. I like it when I have a task that requires using any of these languages, but the problem is I dabble in them so infrequently that I often spin my wheels getting back up to speed. I fumble through old scripts stealing bits and pieces of code. I litter print statements here and there so I can tell what is going on. I basically revert to my freshman year of college in CS101 labs.

 

Now that I'm a "professional", however, at the first sign of trouble I started looking around for tools to help me figure out what was going on. My first dilemma centered around a Python script that was stopping at an unexpected error. I didn't write the script, I really needed it to work, and I don't know a thing about Python. What I really needed was a debugger to help me browse around the script and quickly understand the data structures it was using and what it was trying to do when it failed.

 

A command line debugger was not going to do the trick. I have no problem resorting to something like gdb or Perl's command line debugger when I really have to, but those debuggers are really at their best when the developer is already familiar with the script and has a bit of an idea of what they are up against. A visual debugger is a must when you are stepping through code for the first time and just trying to get a feel for what is going on.

 

A few web searches under my belt, and I was happy to find winpdb, a platform independent visual debugger for Python. The debugger was easy to use, and had a great watch window that not only showed local variables as they entered and left scope, but also correctly displayed control characters inside variables, which made debugging regular expressions much easier (click the picture below for a larger screen shot).

 

(screen shot: winpdb)

 

In a separate task this week I needed to write a script to clean up some "garbage" characters that our legacy help file authoring system was randomly placing in our HTML help. I chose Ruby for the task, as I had written quite a few Ruby scripts in the past and I really like the language. I ended up stumped on a regular expression never matching text that I thought it should (and I knew the expression was sound because I had already tested it in my editor). While I could have resorted to some "printf debugging", my Python debugger experience had me anxious to find a visual debugger I could use with Ruby in this and future projects.

 

Google to the rescue and soon I was debugging my Ruby scripts inside Visual Studio. This thing is brilliant! The screen shots below are from Ruby in Steel. Note the black background is my Visual Studio preference, not an oddity of Ruby in Steel. While Ruby in Steel is not free, it is very inexpensive and I'll be buying a license this week. Debugging my Ruby scripts in the familiar Visual Studio environment put a huge grin on my face. It felt like cheating.

 

(screen shot: Ruby in Steel debugging in Visual Studio)

 

(screen shot: Ruby in Steel debugging)

 

The real kicker for Ruby in Steel was the IntelliSense. Since I don't use Ruby all the time, when I do start writing a script I find myself constantly referencing the documentation to find class methods. I love being able to type a class name, hit dot, and browse for methods without leaving the editor.

 

(screen shot: Ruby in Steel IntelliSense)

 

There are other .NET-based implementations, like IronPython and IronRuby, which I imagine use the Visual Studio debugger as well, but for basic script debugging I found these utilities extremely useful.

Sunday, November 2, 2008

The Need for Feeds

I'm an RSS addict. I love reading technical blogs, and I seriously think my thought processes and career would be on a different track if it were not for the great content I have been exposed to over the last few years of blog reading. Not only has it helped me solve technical issues, but it almost always provokes thoughts and ideas on how I can improve our product, our teamwork, and our development processes.

With that said, something scary has been happening lately: Google Reader has had very few new posts for me to read. A lot of this has to do with my addiction at home to firing up mobile Reader on my phone and catching up during college football timeouts and commercials.

The solution? I propose a trade. I'm going to list some of my favorites, in return for your comments with some feeds you read that I might like. I seriously used to consider this list a bit of a competitive edge and was wary of giving it out. I recently had a change of heart and realized that was fairly lame, so here is my complete list of software related feeds:

Development

First the usual suspects:

Now a few you might not be aware of:

Software Business

ianywhere blogs

Web Development

FoxPro Blogs

Wednesday, October 29, 2008

Visual Studio /nosplash Option

Sara's latest Visual Studio 2005 tip about the /nosplash option is really cool. I didn't expect it to improve startup performance as much as it did. I was amazed at how much faster Visual Studio fires up now.
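The switch just goes at the end of your shortcut's Target line, e.g. (assuming a default VS 2005 install path):

"C:\Program Files\Microsoft Visual Studio 8\Common7\IDE\devenv.exe" /nosplash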

Friday, October 17, 2008

SWFox: Retrofitting Client Server Access

I'm not much of a "live" blogger, but I thought I would post my raw notes from the first session I attended today: Retrofitting Client/Server Access, by Toni Feltman. It was a good class, and one I attended in order to learn the obstacles Visual FoxPro developers face when moving to client/server, and how Advantage can better address these concerns. It was also nice to hear that a handful of the pain points mentioned when migrating to other servers do not exist when using Advantage.

These are my raw notes and observations:

RETROFITTING CLIENT SERVER
Toni M. Feltman - F1 Technologies
Roughly 28 attendees (20% of conf attendees)
---------------------------------
Why moving to C/S
- tables too large, heard this a lot at the booth. Rick mentioned we should show Advantage using a 6 or 7 GB table during our vendor session to drive home the point that we don't have a table size limit.
- security
- performance

Picking datasource:
- mssql
- oracle
- Advantage - she mentioned it is one of the easiest ways to convert to C/S
- db2, sybase, others

Talked about the overhead in getting server set up, "tuned", hiring a DBA, etc. Need quick demo on how easy Advantage is to install and how it self tunes. Also talk about lower cost of ownership because of this.

Moving the data:
- Moving the data isn't necessary with Advantage. Our upsize wizard simply creates an Advantage .add that matches the existing DBC, it doesn't modify or import the dbf tables.

Remote Data Access Methods:

- remote views: maybe not the best, but the easiest, which we have seen with customers porting to Advantage. Those with remote views have been able to port to Advantage very easily.
- USE statement now pulls the entire table from the server. Very poor performance unless you filter the data/view.
- don't have to write update/insert, normal data movement still posts records
- little more difficult to make the connection path dynamic

- cursor adapters
- Toni's favorite way to get to remote data.
- have to manually call TableUpdate to post changes
- easy to make connection path dynamic

- sql passthrough
- easy to make connection path dynamic
- stored procedure calls
- still have to pick a data access technology to execute them, but you then put all business logic in the procedures, the VFP app doesn't execute UPDATE or INSERT statements, the procedures do.

Coverage Log - logs what forms, tables, etc are used. How often lines of code are hit, etc. Basically a coverage profiler. Built into VFP. Could be useful when troubleshooting or when identifying areas of app that need tuning or have data access code that needs to be modified.

Showed cool form usage log. Windows hook to an event for formShow. Keeps log of forms used by the application. Not as nice as full usage tracking of menu items, but this is a quick and easy approach to get a high-level overview of form usage.

Connection sharing is important. By default you don't get this. Need to save connection handle and reuse it if you don't want to be using a lot of connections or connecting/disconnecting all the time.

She stressed remote servers don't have the concept of EMPTY(), which can make dealing with NULL values and empty values a bit of a pain when migrating to client/server. Advantage does support the EMPTY expression, however.

Monday, October 13, 2008

Southwest Fox 2008

I will be attending Southwest Fox 2008 this week with a few colleagues. If you will also be in Mesa be sure to stop by our booth in the exhibit area and say hi. In addition to product demos and answering Advantage questions, we will also be raffling off a Magellan GPS towards the end of the conference.

There will be two Advantage-specific sessions at Southwest Fox this year. There will be a vendor session presented by one of our Systems Consultants, Chris Franz. In addition there is a class titled Advantage Database Server for Visual FoxPro Developers, presented by Doug Hennig.

If you have any Advantage questions or features you would like to see demonstrated (either in person at the show, or here on my blog) let me know via a comment to this post or by e-mailing advantage@ianywhere.com and placing "attn:JD" in the subject.

Friday, October 10, 2008

Solid State Drives

This is a guest post by Mark Wilkins, a Senior Software Engineer on the Advantage R&D team.

Solid State Drives
Solid state drives (SSDs) present a new storage option that is becoming increasingly economically viable. Our colleagues from the SQL Anywhere team had mentioned that they had done some testing with solid state drives. Our own team was having an informal brainstorming session one day, and we were speculating about how these drives will affect the performance and characteristics of Advantage Database Server. Being the diligent and self-sacrificing developers that we are, and because we are good at running with other peoples' ideas, we immediately chased down some cash, bought an SSD, and tested it.

We purchased an OCZ Core Series 64GB 2.5" SATA II drive. Most of the testing was done on two different machines: a Dell Precision 380 3.2GHz desktop development PC with a 7200 rpm Seagate Barracuda SATA drive and a Dell PowerEdge 1950 1.83GHz quad core 64-bit Xeon server with a 15,000 rpm SCSI drive.

When holding the SSD in your hand, the form factor is a compelling feature. It is small, light and very desirable and is fun to carry around and show off to other developers. Of course, when it is dangling off of a SATA cable with no obvious resting place inside a desktop PC that has a bunch of fans and two noisy spinning hard disks, some of its sexiness gets lost in the mix. In the rack machine, I was able to slide it carefully into the far back reaches of the second hard drive bay and click it into place. It is currently still sitting there and will be until someone with long skinny flexible fingers can pry it out.

With minimal searching, one can find dozens of videos on the Internet purporting to show that SSDs are blazingly fast and the best thing since flying toasters. The common theme for these videos is to take two identical laptops, put an SSD in one and a "traditional" hard disk drive (HDD) in the other, point a video camera at them and have the laptops "race". A person will push a button on each laptop to simultaneously boot, load programs, eat bowls of Häagen-Dazs, etc. In the videos, the laptop with the SSD is always a clear winner. In our testing the SSD does perform very well, but its benefit does not seem to be as clear-cut as the videos would have you believe. Go figure. If the HDDs in the videos were extremely full, had minimally sized swap files and were extremely fragmented, then the videos might be realistic. To be fair, though, I don't have two identically configured laptops to try similar tests on, so I could well be wrong.

However, what we were really interested in was to see how Advantage behaved. The short answer is that Advantage performs very well with an SSD; we did not uncover any surprises. In some scenarios the HDD appears to have a slight edge and in others, the SSD wins. In many situations, the I/O caching performed by Advantage Database Server equalizes the performance of the two types of drives. In general, the test results were fairly predictable. For example, the near-zero latency and seek times of an SSD make it possible to read random portions of data from the drive very quickly. Using this knowledge, one can construct tests that benefit from that behavior.

Some of the testing we performed was to run TPC-C transactions over some interval of time with a number of clients. These tests provide a reasonable cross section of queries with a mix of reads and writes across multiple tables. We captured a lot of different test result numbers, and there was no clear winner with the hardware we tested. For example, in one case, we ran 50 TPC-C clients against the quad core server for 5 hours. When running against the SSD, the test performed about 10% more iterations than when running against the HDD. However, when we ran the same test scenario for very short periods (e.g., 1 minute), the HDD typically outperformed the SSD by up to 10%.

In the absence of an obvious winner based on numbers, we were able to resort to trend spotting. Some situations in which the SSD seems to outperform a traditional HDD include the following.
  • Long sustained reads in natural record order.
  • Long sustained update and append operations with multiple indexes being updated.
  • The very first query against an unread (non-cached) table that involves index usage. Once indexes get cached, though, the difference disappears.
  • Reading and updating fragmented indexes.
It ultimately comes down to usage patterns and will obviously vary by application. If, though, we assume that this one SSD that we tested is comparable to other SSDs, then we can conclude that the current crop of SSDs perform very similarly to HDDs and are a valid choice for data storage for Advantage Database Server.

Some Numbers
The following are a few of the numbers obtained during the testing. The next set of simple reindex, read, and append tests were all performed on the Precision desktop workstation.

Reindex 1 million record table (2 indexes):
SSD: 2,096 ms
7200 rpm HDD: 3,573 ms


Read through 1 million record table:
SSD: 2,295 ms
7200 rpm HDD: 2,828 ms


Append 100,000 records with 3 indexes and a memo:
SSD: 27,900 ms
7200 rpm HDD: 28,100 ms


Set AOFs (filters) with 25 clients for 60 seconds:
SSD: 33,006 filters
7200 rpm HDD: 38,006 filters


The next few are some numbers from the TPC-C tests. These particular tests involved 8 tables with updates to 5 of the tables. Each "iteration" was a single transaction that involved an average of 22 queries and 22 updates/inserts. Each of the tests had 50 clients running concurrently.

60 second test on the Precision workstation:
SSD: 2425 iterations
7200 rpm HDD: 1986 iterations


60 second test on the PowerEdge server:
SSD: 3763 iterations
15,000 rpm HDD: 4176 iterations


5 hour test on the PowerEdge server:
SSD: 1,459,455 iterations
15,000 rpm HDD: 1,347,630 iterations


As you can see, the results are somewhat mixed. In general, the SSD would slightly outperform the HDD, but it was certainly not always a given. For example, in the numbers above, the 60 second test on the PowerEdge server consistently had the HDD performing more iterations, but longer test runs usually gave the nod to the SSD. The short test runs, though, on the desktop workstation would typically place the SSD as the winner. Also, the filter (AOF) test would consistently show the HDD as the winner. I don't have any good explanations. One possibility is that the tests on the workstation were not run under ideal conditions. I generally had multiple applications (documents, editors, IDEs, etc.) open while the tests were running. I did not do anything to ensure that those applications would not suddenly decide during a test run to scan for updated files or phone home to see if a critical update had just been released by its vendor. Still, though, that type of situation reflects some real world situations under which Advantage is used.

If any of you have real world results involving solid state drives, it would be interesting to hear about them.

Tuesday, October 7, 2008

Online Poll about Advantage Licensing

Please take a moment to visit my base page at http://jdmullin.blogspot.com and answer the survey question in the top right portion of the page.

The question:
If the Advantage server valcode changed to be a license file (as opposed to a 5 digit valcode), but provided more self-service options and license management via the web, would you consider this a positive or negative change?

This file would be a few hundred bytes. It would be slightly harder to manage. The existing 5 digit strings are easy, allow you to print labels and stick them on product, etc. With a license file approach you would have more license management options via self-service web portals to purchase licenses, expand existing licenses, print entitlement reports, etc.

Along with voting, feel free to post comments with thoughts or concerns.

I'd also like to hear your thoughts on serial numbers. Love them or hate them?

Friday, October 3, 2008

Using the DEFAULT keyword in INSERT Statements

Did you know you can use the keyword DEFAULT when inserting rows via SQL? This can be handy if your table has a read-only field, an autoinc for example, and you don't want to specify a field list when inserting a row. You can use this:

CREATE TABLE tester ( id autoinc, name char(20) );
INSERT INTO tester VALUES ( DEFAULT, 'whomever' );


instead of having to specify a field list like this:

CREATE TABLE tester ( id autoinc, name char(20) );
INSERT INTO tester ( name ) VALUES ( 'whomever' );




If a column has a default field value, the DEFAULT keyword can be used to specify that the default value should be used.

CREATE TABLE tester ( id autoinc, name char(20) DEFAULT 'unknown' );
INSERT INTO tester VALUES ( DEFAULT, DEFAULT );




Tuesday, September 30, 2008

Delphi 2009 Component Palette Categories

In Delphi 2009 the new default behavior for the tool palette is to start with all categories collapsed.



I don't like that behavior at all. I'd prefer the categories I always use to always be expanded. You can change the default behavior in the Tool Palette options. Uncheck the "Auto collapse categories" option.



Tuesday, September 16, 2008

Tips When Porting a Delphi Application to Delphi 2009

Over the past few months I have been keeping notes as I port some components and applications to Delphi 2009, specifically for the purpose of gaining unicode support. While my project is nowhere near complete, I wanted to post some tips. If you are considering porting to Delphi 2009 the CodeGear Developer Network has lots of great documentation describing the new unicode VCL and some unicode basics. This post is simply meant to supplement that information with issues I have encountered and is sprinkled with a few Advantage-specific notes as well.

Database Functionality and Changes

If the AsString method is used on field types that are not ansi-specific you will get unexpected results. For example, if you have a field of type ftBytes and you used to use MyByteField.AsString to set and get these values, you will now be getting unicode characters as opposed to single byte characters. The TBinaryField class has a new AsAnsiString method you will need to use in order to get the results you are used to.
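For example (a quick sketch; MyByteField and Data are just illustrative names):

// Delphi 2007 and earlier - this returned the raw single byte data
Data := MyByteField.AsString;

// Delphi 2009 - AsString now returns a UnicodeString, so use:
Data := MyByteField.AsAnsiString;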

If you read any TBlobStream data into character buffers, those buffers will need to be changed to AnsiChar buffers or TBytes buffers; otherwise lengths, offsets, etc. will not be what your existing code expects.

The Delphi database core has been modified and will now parse an SQL statement like the following:

CREATE TABLE c:\mytable ( id INTEGER );

and interpret it as having one parameter called \mytable. It will then automatically add an item to the query component’s Params property. In Advantage, execution of the statement will then fail with the error “Error 5110: The parameter number specified was invalid for the statement.”, since there is actually no parameter in the statement.

To work around this problem in the Delphi VCL, you must quote the table name. For example:

CREATE TABLE "c:\mytable" ( id INTEGER );

Reading ansi string data from a stream into a string buffer will not work. You need to explicitly read into a temporary ansi buffer first. See below:
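Something along these lines (a sketch; Stream and iByteLen stand in for your own stream and byte count):

var
  Temp: AnsiString;
  Data: string;
begin
  // Read the raw single byte data into an ansi buffer first...
  SetLength( Temp, iByteLen );
  Stream.ReadBuffer( Temp[1], iByteLen );
  // ...then convert it to a unicode string explicitly.
  Data := string( Temp );
end;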

The reason the temporary ansi buffer is necessary is that ReadBuffer automatically determines the type of the destination buffer and acts accordingly. If you had stored ansi data in the file or stream, you need to read it into an ansi buffer, not a unicode buffer. If you pass a unicode buffer to ReadBuffer it is going to think you stored unicode data and will read it out as such.

Under the covers CodeGear has changed the TBookmark type from a pointer to TBytes. This will not affect most applications that simply use the GetBookmark and FreeBookmark TDataSet methods. If, however, you are doing anything "goofy" with the pointer you get from GetBookmark, beware. Many of our automated tests needed to be modified to consume the bookmarks in a more generic/standard fashion.

Advantage-Specific Issues

If using the TAdsDictionary component, many of the parameters for functions like TAdsDictionary.AdsDDSetTableProperty are sent via Pointers. If in the past you were casting string variables to PChar, these casts will need to change to send an ansi character buffer now. For example:

AdsDictionary1.SetTableProperty( 'table1', ADS_DD_TABLE_DEFAULT_INDEX, pchar(strDefaultIndex), 8, 0, '' );

Would need to change to:

AdsDictionary1.SetTableProperty( 'table1', ADS_DD_TABLE_DEFAULT_INDEX, PAceChar( AnsiString( strDefaultIndex ) ), 8, 0, '' );

If you use any APIs that return data into char buffers (in Advantage this means ACE APIs), those buffers need to be redeclared as arrays of bytes or arrays of AnsiChar, not arrays of char (as char is now a unicode char). If you are using the ACE API, a new type called AceChar has been defined and can be used.

Windows APIs

If you are calling Windows APIs and sending in buffers, you may have been using the sizeof function when telling the API how long your buffer is. Those calls need to be changed to use the Length function, as the Windows widechar APIs require the number of characters, not the number of bytes. For example:
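
// GetModuleFileName stands in here for any wide API that takes a buffer;
// assume Buffer is declared as: array[0..MAX_PATH] of Char
GetModuleFileName( 0, Buffer, SizeOf( Buffer ) );  // byte count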

must be changed to:
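
GetModuleFileName( 0, Buffer, Length( Buffer ) );  // character count

With Char now two bytes wide, SizeOf reports twice the number of characters the buffer can actually hold.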

If you were not explicitly calling the “A” functions (for example, SearchPathA), then when compiled in Delphi 2009 your application will now be using the equivalent “W” function (SearchPathW). In many cases this works well, as the Delphi buffers you were passing are now unicode buffers. There are still situations where manual inspection is necessary, however. For example, any APIs that accept a structure will need to be inspected to verify you are populating any string fields in the structure with the expected string type.

Friday, September 12, 2008

Delphi 2009 Conditional Defines

I wasn't going to blog about this, but for the past few months every time I have used Delphi 2009 I have been tricked by this and it's very frustrating.

When you open the project options and head to the familiar "Directories and Conditionals" section be careful. Note it is now under the "Resource Compiler" tree item and does not have anything to do with the old directories and conditionals setting:



You probably don't want to be there. You are likely looking for the old compiler conditionals, which are now hidden under the main "Delphi Compiler" tree item in a property called "Conditional defines":



Thursday, September 4, 2008

Provide Feedback and Vote on Future Features

We have launched a new feedback web page at http://feedback.AdvantageDatabase.com

The site allows you to submit feedback, vote on existing feedback, track the progress of your suggestions and generally provides a single interface for providing feedback to the Advantage team.

We have seeded the site with some of our choices for future features. Check out the site and vote on these features or submit your own new feature requests. If you have submitted feedback in the past but don't see it on the site yet don't worry, we still have more data to upload. Don't hesitate to resubmit your ideas to this new site if you'd like to see them added right away. We will take care of weeding out the duplicates.

Wednesday, September 3, 2008

Updating Static Cursors - Part 2

As I promised in my original post about updating static cursors, here is a screencast demonstrating the technique (9 minute screencast).

I also made a few minor changes to the view and trigger code in the original post, as I noticed I updated the wrong table inside the trigger. :)

I'll apologize to those with smaller screens in advance for the large desktop I used when recording the screencast. In the future I'll switch to 800x600 before recording these.

Do you have a favorite Help authoring software?

I'm in the process of evaluating help authoring software and was curious if any readers have a favorite I have not yet considered.

My core requirements are as follows:
  • Imports either WinHelp or HTML Help 1.x file format
  • Files are stored uncompressed as either XML or HTML
  • Compiles to HTML Help 1.x, Help 2 (Visual Studio), and WebHelp
  • Supports automated builds (command line compilations)
So far I have evaluated:
  • EC Software's Help and Manual
  • Innovasys HelpStudio
  • MadCap Software's Flare
  • Oxygen XML Editor
  • Just Systems XMetaL
And the contenders are:
  • EC Software's Help and Manual
  • Innovasys HelpStudio
  • MadCap Software's Flare (no Visual Studio help support though...)
If you have experience with any of these packages or war stories to tell I'd love to hear them. You can leave any feedback in the comments section of this post. Thanks!

Wednesday, August 20, 2008

Delphi Insert vs Append Performance

For well over 10 years I have thought that the only difference between the Delphi TDataSet.Insert method and TDataSet.Append was where the record appeared in data aware controls before it was actually posted.

For most cases that is true. In fact if you look at the Delphi documentation you will find the method descriptions are almost identical and they both include the following:

"After the new record is applied back to the database server, its physical location is database-specific. For indexed tables, the index is updated with the new record information."

In some user interface situations I can certainly see a use for the Insert method, but I almost always use the Append method, especially when writing batch operations that insert a number of records.

There is a subtle implementation difference in the Delphi TDataSet code (the Delphi code, not the Advantage descendant code) that can have a rather dramatic effect on performance. A call to Append sets an EOF flag in the internal record buffer. A call to Insert does not. The internal EOF flag results in a call to TDataSet.Last before EVERY Post call. A call to Last results in an extra trip to the server. Not a big deal for a single insert, but in a batch insert operation from the client this can have a huge impact on performance, especially if the data is located on a remote machine.

In a very quick test I appended 10,000 records to a table using an Advantage remote server connection to a different PC (not the Advantage server on my test PC, I wanted the network traffic). I ran the test once using Append and once using Insert. I'm sure you can see where I'm going with this... Append took almost twice as long as Insert!
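
For reference, the test loop was essentially the following (a sketch; the table and field names are made up):

var
  i: Integer;
  iStart: Cardinal;
begin
  AdsTable1.Open;
  iStart := GetTickCount;
  for i := 1 to 10000 do
  begin
    AdsTable1.Append;  // swap in AdsTable1.Insert to compare
    AdsTable1.FieldByName( 'name' ).AsString := 'test';
    AdsTable1.Post;    // with Append, TDataSet calls Last before each Post
  end;
  ShowMessage( IntToStr( GetTickCount - iStart ) + ' ms' );
end;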

Append: 8359 ms
Insert: 4409 ms

Batch inserts are best performed via SQL, but I know lots of applications process them a row at a time using Append. If you have code that does this it might be worth your time to try testing with Insert instead.

Tuesday, August 5, 2008

Updating Static Cursors

Ever been happy because you normalized some tables, wrote a query that was way faster than it was before, only to find out later you now have to re-architect a portion of your application? Yep, your fancy join is way faster, but that simple grid interface that used to be a live cursor is now static, and users can't just update it directly anymore.

In version 7.1 (2004) support for triggers on views was added to Advantage. I'd like to take a quick look at how this feature allows you to update static cursors.

Consider the following grid interface to a bug tracking table:



Let's say you modified your schema and added a users table with additional information about each user: location and department. You also modified tables that reference users to store the userID now instead of the username. As a result, instead of a simple:


SELECT bugid, description, owner FROM bugs;

Your query now looks something like this:


SELECT b.bugid, b.description, u.name as "owner" 
FROM bugs b, users u
WHERE b.ownerID = u.id;

All is well, except users can no longer directly update cursor rows, as the result set consists of a join of two tables.

An elegant solution is to first create a view using the query above:


CREATE VIEW BugPreview AS
SELECT b.bugid, b.description, u.name
FROM bugs b, users u
WHERE b.ownerID = u.id


Next, create an INSTEAD OF UPDATE trigger on the view, handling the update yourself using your knowledge of the relationship between the two tables:


CREATE TRIGGER UpdateBugPreview
  ON BugPreview
  INSTEAD OF 
  UPDATE 
  BEGIN 
  declare @new cursor as select * from __new;

  open @new;
  fetch @new;

  try
    -- update fields in the bugs table
    update bugs set description = @new.description,
      ownerID = (select id from users where name = @new.name)
    where bugid = @new.bugid;
  finally
    close @new;
  end try;
  END;


Cursors using the BugPreview view can now be updated directly, restoring live cursor functionality to a dataset that is really composed of two physical tables.

This technique allows you to normalize data and increase query performance while still maintaining simple "single table" front-end interfaces to your datasets. Adding INSTEAD OF DELETE and INSTEAD OF INSERT triggers in a similar fashion can provide DELETE and INSERT functionality on this static cursor as well.
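
For example, the DELETE case might look something like this (a sketch against the same two tables, following the pattern above; only the bugs row is removed):

CREATE TRIGGER DeleteBugPreview
  ON BugPreview
  INSTEAD OF
  DELETE
  BEGIN
  declare @old cursor as select * from __old;

  open @old;
  fetch @old;

  try
    -- delete the underlying bugs record; the users table is untouched
    delete from bugs where bugid = @old.bugid;
  finally
    close @old;
  end try;
  END;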

Note there is some additional maintenance if you use this technique. You now need to be aware of these triggers if you modify the list of columns displayed by the view. Any changes to the column list will require some slight modifications to the triggers. This example also requires a unique user name, and doesn't handle the name not existing yet (you could catch this and insert a record, however).

In my next post I will include a screencast showing an implementation of this solution from start to finish.

Monday, August 4, 2008

API Documentation Searches with Firefox

I stumbled on something today that might be of use if you use Firefox and were not already aware of it.

I accidentally typed a Windows API name into the Firefox address bar, instead of into my Google toolbar search box.

Normally I would expect the browser to either try to load "http://www.CreateFile.com" or to perform a web search for "CreateFile". I was surprised when I was immediately greeted with the MSDN documentation for the CreateFile API. No intermediate search and an extra click on the search results, which is my usual routine.

That was pretty handy. So I tried a Delphi function (GetMem), and sure enough I was greeted with GetMem documentation from www.delphibasics.co.uk.

This seems to work kind of like the Google "I'm Feeling Lucky" search that pulls up the page with the highest rank. It's rather handy and much faster than waiting for the Microsoft document explorer to load and display help pages.

Tuesday, July 15, 2008

ARC: Did You Know #4

Did you know you can execute a portion of an SQL script buffer in the ARC SQL Utility?

Many new users are not aware that you can highlight a specific section of a script and just execute that selection.



The technique can also be utilized with the SQL debugger. You can set a breakpoint and select a portion of a script, or you can select a portion of a script and simply hit F10 to start debugging just that selection. Oftentimes this is not terribly useful, as you would almost always need the variable declarations at the top of the script in order for the script to run. However, this can be a useful technique if you have a large upgrade script that creates procedures/triggers/functions and you want to debug or execute just a portion of those new objects without pulling them out into a temporary script file.

Thursday, July 10, 2008

Advantage Training

Warning: Marketing content below! I strive to provide mostly technical content in this blog, but this post is purely trying to let you know about a training opportunity. It's technical training, so I hope you will let this slide... :)

I can tell from my web and RSS stats that I have picked up quite a few Visual FoxPro readers. The Getting Started with Visual FoxPro and Advantage post remains one of my most popular. If you are interested in learning more about Advantage, our Advantage Technical Summit would be the absolute best place to start. We provide the training and meals, you only have to pay for airfare and hotel.

Our trainings in Boise not only provide an opportunity to attend classes, but you also get to mingle with other Advantage users. Listening to their success stories, tips and tricks can often prove invaluable. In addition, you have full access to our support and engineering staff who are always happy to answer questions, sit down and debug code, help write some new code, etc.

The next training is September 10th and 11th. Check out the schedule here, and register here. We can always schedule one-on-one sessions if you have specific questions or problems.

If you can't attend this training be sure to stop by our booth at Southwest Fox 2008 October 16-19.

Thursday, July 3, 2008

Decoding winmail.dat files

I don't have a Microsoft e-mail client, and every once in a while I get an e-mail with a single winmail.dat attachment.

I use a simple utility I thought some readers might find useful called WMDecode to extract the attachments. It's simple and has worked on every winmail.dat file I have used it on.

There is a free version with a timeout, or you can buy a perpetual license for 10 dollars.

Tuesday, July 1, 2008

Nitpicks Matter

I mentioned in a previous post that nitpicks matter. This is the main premise of Joel Spolsky's book User Interface Design for Programmers, which, while simplistic in the realm of UI design, is very fun and easy to read and significantly influenced my redesign of the Advantage Data Architect (ARC) GUI in version 8. Every little thing that might not seem significant adds up to a poor user experience. Sometimes you can't quite peg what is wrong. It's nothing major, nothing worth spending the time reporting to the software vendor, but it's just annoying.

The irony here is that often the little bugs that drive you crazy are the easiest to fix. One or two lines of code and that bug that makes you curse every time you run into it is gone. You are happy as an end user, and we are happy as developers because we were able to quickly and easily address something that was bothering you.

While adding some stored procedures and triggers to an internal utility that uses Advantage recently, two things became very apparent: 1) I had not done enough usability testing with the SQL debugger and 2) users are not sending us the nitpicks that are annoying them.

#1 is entirely my fault, and there isn't much I can do now except fix up my mistakes and do some more usability testing. The rewarding part is as mentioned above, many of these fixes are just small tweaks in the user interface behavior.

#2 became apparent as I ran into at least 4 issues that drove me crazy. The cursor left the editor buffer after every execution (which one user did report, thanks!) and ARC would not position the cursor at an error location when saving a trigger with errors. I had a less than stellar experience. A few days later a partner called me to explain a similar experience they had encountered. I'm convinced many others have run into the same issues but do not have the time or do not know how to report them. Many also probably thought the bugs were nitpicks and did not warrant nagging us with an e-mail. Nag us, please!

How do you report these nitpicks? In the future we will put a simple interface into ARC that allows you to report whatever you want with no required input fields. Until then, you can always send email to advantage@ianywhere.com and put "attn: JD" in the subject field. Those e-mails will get to me and I will log them in our bug tracking utility myself. Often a small video is much easier than expressing re-creation steps in an e-mail, so feel free to use utilities like Jing to send URLs of short video clips if that is easier.

ARC: Did You Know #3

Did you know you can "bootstrap" a debug session in ARC? You don't have to have any breakpoints set to start debugging. Simply clicking "step over" (F10) or "step in" (F11) will start a debug session and stop on the first line in the script.

This is a handy way to get ARC to quickly jump to the source code of a procedure or trigger you want to edit. For example, in this video I use this functionality to quickly jump into a procedure, edit it, and then verify my changes.

In the future the SQL Utility in ARC will hopefully have the ability to directly open procedures, triggers, and functions via the open command, but for now this is a convenient shortcut (as opposed to having to open the procedure properties dialog).

One thing to keep in mind is that once you have canceled the session and are inside the procedure buffer in the editor, you need to click "step over", "step in", or "start debug session" (Shift-F5) to start a new debug session, not "execute" (F5). Clicking "execute" will just attempt to execute the current buffer because you bootstrapped the initial debug session and there are no breakpoints set (breakpoints are how ARC usually decides if you want a debug session or if you just want to execute the script you are currently viewing). Hitting F10 or F11 will use the base script you started with as the driver for the debug session, which is likely what you intended.

Another alternative is to set a new breakpoint once you have the procedure buffer loaded in the editor. At that point the "execute" command will always start a debug session, and will always use your driver/base script to call the procedure.

Wednesday, May 28, 2008

Automatic Application Updates

By now all of you 9.0 users should have received your first automatic update of ARC - Advantage Data Architect v9.0.0.1. I thought I would take a few posts and discuss how we implemented our automatic updates.




Automatic updates are very common, and I would guess the majority of you have already implemented them in one way or another. They make a lot of sense for vertical market applications, but I have always questioned their validity with regards to a database server. Not many developers want their end users getting updates to an application/database combination that has already been run through a QA process.

That being said, ARC is a tool that is not directly tied to your application and it makes sense to keep your installation up to date with the latest bug fixes. Hence our decision to finally add automatic updates to ARC.

We started with the following requirements:
  • Don't interrupt the user's workflow
  • Automatically download the correct file
  • Allow users to manually check for updates
  • Allow users to postpone or skip an update
  • Provide a list of bug fixes

Don't interrupt the user's workflow

I can't stand applications that ask me if I want to update the minute I open the application (à la Firefox). Did I open the application just to update it? Not likely. ARC prompts when you are closing the application. Presumably you have already accomplished what you set out to do, and we did not sidetrack you or worse yet make you completely forget what you were about to do.


Automatically download the correct file

Another pet peeve of mine is applications that tell you an update is available, but then just send you to a website rather than downloading the file automatically. The icing on the cake is applications that send you to a web site and then make you think. Many send you to a complicated download site that requires you to know what version you currently have, provide account login credentials, etc. ARC will automatically download the correct installation and start it for you.


Allow users to manually check for updates

This one is pretty much a standard. The "Check for Updates" menu option in the help menu.


Allow users to postpone or skip an update

Postponing is fairly standard. We wanted to also provide the option to skip a version altogether. Perhaps what you have is working great and you never want to hear about 9.0.0.1 being available ever again. We will still nag you when 9.0.0.2 is released, but not until then.


Provide a list of bug fixes

It's nice to know what you are getting yourself into. Does this version just fix 2 bugs, neither of which you have ever run into? If so, you can skip this version.


In my next post we will start taking a look at the code behind ARC's automatic update functionality. July 01 2008 Update: I have not yet revisited the code and written a post about it, but I do still plan on it at some point if there is enough interest. If you are interested post a comment which might kick my lazy butt in gear.


The How-To Geek

I need to post a link to the How-To Geek blog.

If you don't already subscribe I highly recommend this feed. I've learned about lots of cool utilities and shortcuts that increase my productivity every day.

The most recent post that caught my attention was this post about Jing. From the Jing website: "The concept of Jing is the always-ready program that instantly captures and shares images and video…from your computer to anywhere."

I used it for a few minutes today and it was very simple. Ideal for sending a quick clip of a bug you are seeing but don't want to explain to your peers via a five paragraph e-mail... :)

Chris's Blog

For those of you who do not monitor the Advantage newsgroups, Chris (one of our Sales Engineers/System Consultants) announced a few weeks back he started blogging about Advantage.

I made a note at that time to post a link to his blog, but just now got around to cleaning up my inbox at home and remembering the post. Looks like he's been busy writing since then, so be sure to check it out at http://advantageevangelist.blogspot.com/

Wednesday, May 14, 2008

Use Visual FoxPro and Visual Studio - and Share Your Existing DBFs

I put together a short screencast (10 minutes) showing how Advantage can share Visual FoxPro DBF tables with Visual Studio. This provides a variety of alternatives when migrating existing applications to client/server.

In the screencast I demo a Visual FoxPro application using Advantage sharing data with a Visual Studio application using Advantage (not earth shattering news, I know). I then move on to show something I do think is worth some consideration: a Visual FoxPro application using DBF tables WITHOUT Advantage (standard DBF access, no ODBC or OLE DB) sharing data with a new application written using the Advantage .NET Data Provider.

The second portion of the screencast calls out a major benefit to Visual FoxPro developers. You can continue to support your existing applications while doing new development using Visual Studio and all applications can share the exact same database.

If you find that interesting, be certain to also check out my previous post explaining some of the ISAM-specific functionality Advantage brings to .NET development via our direct table access.

While I call out Visual Studio in this screencast, the same approach can be taken with any of the development environments Advantage supports: Delphi, Visual Objects, JDBC, PHP, etc. Any of the clients listed on the downloads page of the DevZone can share data with an existing Visual FoxPro application.

View the screencast now

Download the screencast (19MB, open fox_sharing.html after unzipping)

Wednesday, April 23, 2008

Faster GUID Keys

In the Advantage 7.x days, there was no built-in mechanism to generate a GUID. Many users ended up writing INSTEAD OF INSERT triggers as DLLs or .NET assemblies in order to automatically add a GUID key to a record.

While that method works well, and was the only option back then, there is a lot of extra cost for a relatively simple task. The server has to populate the __new and __old records, call into the trigger (if it is an assembly, that call is marshalled, adding more overhead), clean up, etc. Plus you now have to maintain and distribute external DLL or assembly files.

Advantage 8.1 introduced a new scalar function and expression engine function called NEWIDSTRING. This function can be used to generate a GUID using a variety of formatting options. Because it is not only an SQL scalar function, but also an expression engine function, it can be used as a default field value as shown in the following image.



In this example I am using the "file" encoding, which results in a smaller key value (22 bytes). Note this is a case sensitive character field as opposed to a cicharacter field. As mentioned in the help file, this particular type of encoding (Base64) is case sensitive. A variety of encoding options are available and described in the help file.
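
You can set the same default via DDL as well. Something like the following (a sketch; check the NEWIDSTRING topic in the help file for the exact encoding argument and supported options):

CREATE TABLE tester2
(
  id CHAR( 22 ) DEFAULT NEWIDSTRING( file ),  -- "file" (Base64) encoding, 22 byte key
  name CHAR( 20 )
);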

Not only is the default field value much faster, but it also provides the default field value immediately upon record insertion, as opposed to waiting until the new record is posted. To see this behavior in action, configure a default field value for a table, then open that table in ARC and add a new record. Before you even post you will see the GUID value populated in the pending record buffer.

Tuesday, March 18, 2008

Advantage 9 Ships

For those of you who don't yet subscribe to the Advantage Announcements Feed, Advantage 9 has shipped.
View more details here, including benefits, whitepapers, datasheets, etc.

Monday, March 10, 2008

Advantage SQL Debugger

With the release of Advantage 9 right around the corner, I thought I would post a screencast showing the new visual SQL debugger in action.

The screencast (11 minutes) can be found here.

One side note is there is a bug in the 9.0 release that I will have to address. If you click on any of the "step" buttons with the mouse, then don't move the mouse at all and click again, the menu is not registering the click event. If you notice in the video I move the mouse just slightly between clicks to make it work. I normally just use the keyboard shortcuts anyway, which is why this went unnoticed until it was too late to fix it for the release. It will be addressed in the first 9.0 update, however. And since ARC 9.0 has auto-update functionality, the new fix will get pushed out to everyone, which should be cool.

I'll be posting screencasts of more of the 9.0 features as time permits, but the SQL debugger "demos well", so it was an easy first choice. :)

Thursday, February 21, 2008

Getting Started With Visual FoxPro and Advantage

I've put together a short (11 minute) screencast for FoxPro users who are curious about Advantage, but don't know where to start.

It is always easier (at least for me) to watch someone else do something than it is to search through mounds of documentation that is unfortunately often written for readers who already have some background with the product.

Let me know if you have other questions or topics you would like to see in a future screencast.

View the FoxPro Screencast Now

Download the screencast (26MB, open fox1.html after unzipping)

For more information on Advantage you can visit Getting Started for Visual FoxPro Developers, e-mail us at AdvantageInfo@iAnywhere.com, or attend an Advantage Training.

Wednesday, February 13, 2008

Direct Table Access in ADO.NET

Dealing with the disconnected recordset paradigm in ADO.NET can often present hurdles, especially if you come from an ISAM/navigational background and are used to dealing with direct table opens.

The Advantage .NET Data Provider provides a class that is meant for all of you ISAM lovers out there (we know you're out there), the AdsExtendedReader class.

While this class is a descendant of the AdsDataReader class, its functionality is not limited to reading data. The class can be used to set scopes, perform index seeks, lock specific records, update records, etc. It is basically a class that exposes much of the Advantage ISAM functionality to ADO.NET users, and is part of what makes Advantage unique.

One powerful use of this class is to provide lookup combo boxes that do not require the overhead of multiple SQL statement executions. An AdsExtendedReader can be used to quickly scope table contents based on the current contents of the combo box. Let's take a look at some example code that does just that.

I have built an example application with two controls: a button and a combo box. The button is used to open a table and set an active index:
    private void button1_Click( object sender, EventArgs e )
    {
      // connect
      adsConnection1.Open();
      
      // open the table, adsCommand1 is just a "SELECT *" statement
      oExtendedReader = adsCommand1.ExecuteExtendedReader();
      
      // Set the active index
      oExtendedReader.ActiveIndex = "lastname";
    }

The combo box has one event defined: the TextUpdate event. This event fires whenever the text in the combo box has been modified. As the user types, it sets a scope (sometimes also called a range) on the table and fills the drop down with values from the table that pass the scope conditions.

Setting a scope results in a simple seek operation on the table. No files are closed and reopened. No SQL engine is involved in re-executing a query. An index seek is the fastest way for Advantage to locate a record.
    private void comboBox1_TextUpdate( object sender, EventArgs e )
    {
      Object[] oKeys = new Object[1] {comboBox1.Text};
      int i = 5;
      int iCursor;
      
      // Here's the fun part, a connected dataset scope inside .net
      // This table could have a million records, but they are not all read
      // into an in-memory table on the client. We are using an index on
      // the server for a very fast and efficient scope.
      oExtendedReader.SetRange( oKeys, oKeys );
      
      // Save current cursor position
      iCursor = comboBox1.SelectionStart;
      
      // Clear the combobox list and add a few items from the reader's current
      // position.
      comboBox1.Items.Clear();    
      oExtendedReader.Read();
      while ( ( !oExtendedReader.EOF ) && ( i > 0 ) )
        {
        comboBox1.Items.Add( oExtendedReader["lastname"].ToString() );
        i--;
        oExtendedReader.Read();
        }
      
      // If there are some matches, show the drop down
      if ( comboBox1.Items.Count > 0 )  
        comboBox1.DroppedDown = true;  
        
      // Set the cursor back to where it was
      comboBox1.SelectionStart = iCursor;             
    }

The video below shows the example application in action.

(video: the example application in action)

What are you using the AdsExtendedReader for? Any functionality you would like to see added?

Thursday, February 7, 2008

Delphi: Did You Know #1

Did you know you can bring up the ARC table designer directly in the Delphi IDE?

No need to open ARC if you want to add fields, modify indexes, etc. Just right-click on the TAdsTable component and select "ALTER/Restructure Table...".


You will get the exact same table designer that is in ARC, only without the trouble of disrupting your workflow and starting ARC, connecting to the database and opening the table properties.


Make your structure change, close the dialog, and get back to work.

Sunday, February 3, 2008

ARC: Did You Know #2

Did you know you can finally drag and drop files onto an existing ARC instance and it will open them?

The main reason I am posting this is for those of you who tried years ago, learned ARC could not open the files, and gave up trying. The drop support was fixed in a recent update to ARC 8.1.

You can now drag a table or dictionary file from Windows Explorer, drop it on ARC, and it will be opened.

Added Bonus: A sidebar of sorts on reporting bugs/enhancement requests

I'm going to attempt a preemptive strike against the "This is trivial. Why didn't you do this earlier?" comments: We love the keyboard. By "we" I mean the Advantage R&D team. We are not big mouse users, which is why you have always been able to open a table or dictionary from the command line. This doesn't mean ARC shouldn't be able to open a table by dragging it; it just means we don't use ARC that way, so we didn't notice the missing support.

I'm sure many of you have tried this in the past, noticed it didn't work, mumbled something under your breath, and continued on. I think this is very common in the software industry and I know I do it all the time. I don't have time to interrupt my workflow, find some way to submit a request to the vendor, and then post what feels like a "trivial" problem that I don't think they will take seriously. (I'll write an entire post in the future about how important those "trivial" things that drive you crazy truly are.)

I'm hoping in the future we can add a simple dialog to the help menu in ARC that lets users submit a bug or feature request. You wouldn't have to search a web site or post to our newsgroups. We wouldn't require any information other than your request and your e-mail address if you feel like providing it so we can clarify your request if necessary.

My favorite third-party website is provided by Developer Express. I love their support center. Before submitting a question or bug report, I am forced to do a quick search of existing issues. If I don't find a relevant post, I am allowed to post my question. I can keep my question private, or by default have the entire conversation with their support staff public. This automatically results in a superior knowledge base for other users to search. This is a bit more ambitious than what I am currently proposing, but I'd like to see us get to this sort of system at some point.

Anyway, hope you enjoyed this "short" Did You Know...

My FoxShow Interview

A recent interview I did with Andrew MacNeill has been posted on his FoxShow podcast #49. Andrew and others in the FoxPro community have been fairly interested in Advantage and our upcoming Visual FoxPro support. Thanks again to Andrew for the interview. It's much easier to explain a product in person like this than with a few marketing bullets in a print ad.

Wednesday, January 30, 2008

ARC: Did You Know #1

Inspired by Sara Ford’s “Did You Know” series on Visual Studio, I thought it would be fun to do the same with Advantage. The majority of these will focus on the Advantage Data Architect (ARC), but I will sometimes cover other parts of the product as well.

Did you know that ARC can automatically generate INSERT statements for all rows in a table? The functionality is hidden in the “Tables To Code” form right now. In the future I hope to expose it via the existing export functionality or the automatic SQL generation you can access by right-clicking on any object in the Connection Repository.

For now, if you want a batch of INSERT statements generated for a table click on the Tools -> “Export Table Structures as Code” menu item.

Click on “Add Table(s)” and select one or more tables.

Choose SQL as the export type. When you do this, the “Include Existing Data” checkbox will become active. Check this box to generate not only the CREATE TABLE statement, but also an INSERT statement for every row in the table.
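
To give a feel for the output, a script generated for a hypothetical two-column table would have this general shape (the table name, columns and data here are made up purely for illustration):

CREATE TABLE customers ( id Integer, lastname Char( 30 ) );
INSERT INTO customers ( id, lastname ) VALUES ( 1, 'Smith' );
INSERT INTO customers ( id, lastname ) VALUES ( 2, 'Jones' );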

If the table is large, you can specify an output file, which will be more efficient than just having ARC build the output in memory and display it on the screen.

Tuesday, January 29, 2008

A Database Trigger That Maintains an RSS Feed

Since my opening post was about how much I’m addicted to RSS, I thought it would be fitting if my first Advantage-specific post was related. In this post I will talk a bit about a trigger I wrote that updates an RSS feed automatically. At the end I will include a link to a zip file with the trigger source.


This trigger is used in a bug tracking database. R&D engineers and support engineers subscribe to the feed to stay up to date with known bugs in the product. Readers obviously won’t memorize every entry, but it helps facilitate that “Hey, I think I remember a bug similar to this” moment that can often be helpful. It can also replace e-mail notifications when new bugs are entered into a tracking system. RSS is a much friendlier way to deliver these notifications, as the feed can be read at the consumer’s leisure instead of interrupting them or littering their inbox the way e-mail notifications would.


Triggers can be fun to write because they can often add useful functionality with zero changes to the application itself. This particular trigger is defined as an AFTER INSERT trigger on the base bug table. It takes a few details from the new row, including the bug ID and description, and writes them to an xml file.


This particular example is a trigger DLL written in Delphi. The first order of business is to extract the information we want to include in the feed from the __new table.
// The __new virtual table contains the row that was just inserted
oQuery.SQL.Text := 'SELECT * FROM __new';
oQuery.Open;
strID    := oQuery.FieldByName( 'bugid' ).AsString;
strTitle := oQuery.FieldByName( 'title' ).AsString;
// Build the item description: the bug number, a blank line, then the full text
strDesc  := 'Bug #' + strID + #13 + #10 + #13 + #10 +
            oQuery.FieldByName( 'Description' ).AsString +
            #13 + #10 + #13 + #10;
oQuery.Close;

The next block of code reads a few settings from a settings table in the database. The main purpose of the settings table is to avoid hard-coding these values inside the trigger.

The settings table looks something like this:

Property                  Value
rss_feed                  c:\path\to\myfeed.xml
rss_feed_item_template    c:\path\to\myfeedtemplate.xml
rss_feed_link             http://myserver/somesite/buginfo.php?BugID=

The first two rows include paths to the feed file (myfeed.xml) and the feed template file (myfeedtemplate.xml). The feed template file contains the xml shell to generate a single new feed item. The final property, rss_feed_link, is used to embed a link readers can click on that will take them to a web page where they can view more details about the particular feed item.
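
The template file itself isn't shown here (the actual file is included in the download at the end of this post), but conceptually it is just a standard RSS 2.0 item shell with placeholders where the title, link and description get substituted. Something along these lines, with purely illustrative placeholder names:

<item>
  <title>%%TITLE%%</title>
  <link>%%LINK%%</link>
  <description>%%DESCRIPTION%%</description>
</item>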


And the code that reads the settings table looks like this:

oQuery.SQL.Text := 'select * from settings';
oQuery.Open;
if not oQuery.Locate( 'property', 'rss_feed', [loCaseInsensitive] ) then
  // no feed configured, just exit
  exit;
strFeedFile := oQuery.FieldByName( 'value' ).AsString;
if not oQuery.Locate( 'property', 'rss_feed_link', [loCaseInsensitive] ) then
  // no feed link configured, just exit
  exit;
strFeedLink := oQuery.FieldByName( 'value' ).AsString + strID;
if not oQuery.Locate( 'property', 'rss_feed_item_template', [loCaseInsensitive] ) then
  // no feed item template configured, just exit
  exit;
strFeedItem := oQuery.FieldByName( 'value' ).AsString;
oQuery.Close;


Finally, a utility function is called to actually add a new feed item to the file.

AddFeedItem( strFeedItem,
             strFeedFile,
             strTitle,
             strFeedLink,
             strDesc );

The base feed file includes an identifier that can easily be located; in this case it is the xml comment <!-- next item here -->. The AddFeedItem function constructs a valid feed item and replaces the comment with the new item, which itself ends with the same comment, facilitating the next insertion sometime in the future.
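
The full AddFeedItem source is in the download below, but the heart of it is that replacement step. A minimal sketch of the idea might look like this (names and structure are illustrative, not the actual source):

uses Classes, SysUtils;

procedure InsertIntoFeed( const strFeedFile, strNewItem : string );
var
  oFeed : TStringList;
begin
  oFeed := TStringList.Create;
  try
    oFeed.LoadFromFile( strFeedFile );
    // Replace the marker with the new item followed by the marker again,
    // leaving a landing spot for the next insertion
    oFeed.Text := StringReplace( oFeed.Text,
                                 '<!-- next item here -->',
                                 strNewItem + sLineBreak +
                                 '<!-- next item here -->',
                                 [] );
    oFeed.SaveToFile( strFeedFile );
  finally
    oFeed.Free;
  end;
end;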


Verifying Your Feed

I use the following site to verify the feed after testing. This is particularly important when generating the xml in code, as we aren’t using any third party libraries and there is real potential for producing malformed xml:

http://www.feedvalidator.org/


Performance Note

It should be noted that this trigger is not suited for a table that undergoes frequent concurrent inserts or large batches of inserts. The trigger reads and rewrites the entire feed file every time it fires, and access to the file has to be synchronous. The last thing you would want is 100 users waiting in line for access to the feed file.


Exercises Left to the Reader

This trigger was written in haste with a “good enough for government work” mentality. In other words, I was fairly lazy but this gets the job done. As such, there are certainly some issues that would need to be addressed if you wanted it to be bullet proof.

One that comes to mind: if absolutely every new row needs to appear in the feed, you would need to address the code that bails out when it can’t get a deny-write lock on the feed file. In its current state, the trigger just gives up after a few tries and exits.

You may also want to limit the number of items you put in the feed. Once the feed reaches a certain size, you could delete the oldest item for every new item you add.
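
If you tackle that one, the trimming step might be sketched like this, assuming new items are always inserted at the top of the file so the oldest item is the last <item> element (a rough illustration only; PosEx lives in the StrUtils unit):

uses StrUtils;

// Remove the last (oldest) <item> element from the feed xml
procedure TrimOldestItem( var strXml : string );
var
  iStart, iPos, iEnd : Integer;
begin
  // Delphi has no reverse substring search, so walk forward with PosEx
  // to find the final occurrence of '<item>'
  iStart := 0;
  iPos := Pos( '<item>', strXml );
  while iPos > 0 do
  begin
    iStart := iPos;
    iPos := PosEx( '<item>', strXml, iPos + 1 );
  end;
  if iStart = 0 then
    exit;
  // Find its closing tag and delete the whole element
  iEnd := PosEx( '</item>', strXml, iStart );
  if iEnd = 0 then
    exit;
  Delete( strXml, iStart, ( iEnd + Length( '</item>' ) ) - iStart );
end;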


The Source Code

The trigger source code, a base feed file, and a feed item template can be found here.

Wednesday, January 23, 2008

Book Review – The Pirate’s Dilemma by Matt Mason

I attended the Business of Software Conference last year and had the pleasure of watching Matt Mason give a presentation. You can view it here. The presentation, while not directly related to software piracy (although he does his best to incorporate it), is thought provoking and very interesting. Matt is a talented individual.

When his book The Pirate’s Dilemma came out I bought a copy. The book is full of examples of piracy from punk rock to patent law, graffiti, the recording industry, pharmaceuticals and more.

I felt like there was more storytelling going on than actual work linking the stories together to make a point, but the truth is the stories were all very entertaining and most could stand on their own quite well. Still, I felt a bit lost as I jumped from story to story, wondering if things would come together in the final chapter.

The final chapter didn't really provide the "Ah Ha!" wrap-up I was hoping for, but the book is a quick read, has a lot of fun stories, and helps you view all forms of piracy from a different perspective.

On my book review scale (1=skip it, 2=bookmobile, 3=buy it) I give this book a 2.

The next book on my night stand is from two authors who obviously couldn’t agree on a title: Hard Facts, Dangerous Half-Truths & Total Nonsense: Profiting from Evidence-Based Management

On Standby: The Illusions of Entrepreneurship

Tuesday, January 22, 2008

Why Blogs are not as Lame as they Sound

I’ll never forget the first time I heard the term blog. How lame. An online journal of some sort? Nerds pouring out their feelings and daily activities to strangers on the web; not my scene. I truly didn’t get it, and didn’t spend any time investigating why I would be interested in blogs.

That must have been some time around 2003. About a year later I was reading the newsgroup of a company we work closely with and was shocked to see one of the engineers mention an upcoming change that was going to require quite a bit of work on my part in order for Advantage to be compliant. I jumped into the conversation and it went something like this:

“How can you be changing this and you didn’t tell your partners until just now? No e-mail, no communication of any sort? This is ridiculous.”

“I’ve been talking about this change in my blog for months”, was the reply.

“I’m very busy. I don’t have time to go and check the blog or website of every vendor I interact with, that would be a full time job in and of itself.”

“Use an RSS reader”

This is where I had to stop and do some research. I knew syndication was a popular way to get various news items to web portals, but I had no idea what an RSS reader or an RSS aggregator was.

Of course, I would find out that an RSS reader is the cornerstone of what makes blogs tick, and actually downright addictive. The technical blogs I now subscribe to via RSS feeds are my number one source of technical information. With printed magazines going the way of the buffalo, technical blogs are an excellent way to learn new technologies, stay up to date with companies and products you are interested in, and discover tips and tricks you would otherwise never be exposed to. RSS feeds are not limited to blogs, either. Practically any published content can benefit from having an RSS feed, and tons of web sites provide feeds for news content, updates to their download sections, and so on.

If you already use an RSS reader, this post is of little interest. I wouldn’t be writing it except for the fact that I run into so many people that still don’t have a reader set up, and are completely missing out on what I think is an amazing wealth of knowledge and something that truly helps developers be more productive.

In fact, I wouldn’t have started my own blog if not for the fact that I have learned so much from others that I feel the need to try to contribute and start that same sort of knowledge sharing in the domain I work in and with the community I interface with (that’s you, whoever you are).

If you are new to technical blogs and don’t know where to start, it is relatively easy. You subscribe to one or two feeds that interest you, and eventually those authors will reference someone who interests them. You follow that link, subscribe to the referenced author for a while, and see if you like his/her posts. If so, you stay subscribed, and eventually that referenced author will reference someone else. Rinse, lather, and repeat. Soon you will be monitoring 100 feeds, but the beauty is you will spend no time whatsoever hunting for content; it will all be brought to you, and only when there are new items to read. Brilliant.

There are a ton of readers to choose from. I use Google Reader because I like to view my feeds from home, from work, and from my phone. I’ve used Omea Reader in the past, and many on my team use RSS Bandit. A web search for “RSS Reader” should turn up lots of choices.

In the future I’ll post a few of my favorite feeds to get you started.