11-21-2001, 07:39 PM
please someone tell me the implications of this design (aside, of course, from hogging hardware resources).
i created global recordsets, which are initialized and populated with records at app start-up.
if i need any one of these recordsets, i just create a clone.
is this ok? i know some basic facts about cloning a recordset; i'm sure you guys could tell me more..
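For anyone following along, here's a minimal sketch of the pattern being described (the module, table, and connection-string names are my own assumptions, not from the post):

```vb
' modGlobals.bas -- illustrative names throughout
Public gCnn As ADODB.Connection
Public gRsProducts As ADODB.Recordset

Public Sub InitGlobals()
    Set gCnn = New ADODB.Connection
    gCnn.Open "Provider=SQLOLEDB;Data Source=MyServer;..."   ' assumed connection string
    Set gRsProducts = New ADODB.Recordset
    gRsProducts.CursorLocation = adUseClient   ' Clone requires a cursor that supports bookmarks
    gRsProducts.Open "SELECT * FROM Products", gCnn, adOpenStatic, adLockReadOnly
End Sub

' Elsewhere in the app, when the data is needed:
Public Sub UseProducts()
    Dim rs As ADODB.Recordset
    Set rs = gRsProducts.Clone   ' no second fetch; clones share the original's data cache
    rs.MoveFirst                 ' each clone keeps its own independent cursor position
    ' ... walk rs ...
    rs.Close                     ' closing a clone does not close the original
End Sub
```

One point worth knowing about Clone: because clones share the same underlying data cache, an edit made through one clone is immediately visible through the others — they are separate cursors, not separate copies.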
11-21-2001, 07:54 PM
why clone them?
There's no point populating them at startup, since the source data may change before you use them.
Just declare the variables Public (if you really need to use them in different modules) and populate/close them when you need them..
11-21-2001, 07:56 PM
Never done clones before, but I have always been under the impression that you shouldn't open recordsets and database connections globally. The main reason is, as you say, overhead. That said, there are other reasons I can think of:
1. Database performance, at least from other users' standpoint. If you keep the recordset and the database connection open for the life of your app, you can run into performance problems (not to mention record-lock problems).
2. Update problems. If someone else changes a record while your app is running, then with your clone method you will not see the changes being made.
Well, all that being said, in the end you are the one who has to decide whether these things will affect your app. If it is a standalone app then they probably won't affect you too much (aside from the stale-data bit).
11-21-2001, 09:46 PM
my real intention is to minimize database reads.
i have several recordsets that take an awful lot of time to open. i guess i really have to check how my app performs using cloned recordsets.
thanks a lot guys!
11-21-2001, 09:55 PM
do you really need huge recordsets? Can't you use more specific SQL statements?..
11-21-2001, 10:08 PM
I'll often open a connection at app startup & keep it open during the life of the app. Recordsets I usually create/open/close/release as needed though.
I'm normally connecting to Oracle, which takes 3 to 5 seconds to get that initial connection established. While this can be somewhat expensive in terms of the number of Oracle sessions held open, the performance gain from not having to wait for the connection is generally worth it. I do usually add code to close the app after 30 minutes or so of inactivity.
11-21-2001, 10:18 PM
let me just part from the topic for a moment and say what a class picture you have loquin; it's definitely the funniest one I've seen on this forum! Very good - go "The Brain" Homer!! LOL.
11-21-2001, 11:18 PM
yeah, i got some huge rs.
11-22-2001, 10:04 AM
I have often thought of doing what you suggest there. Never got around to testing it to see what kind of performance boost that would give my apps. Perhaps it is time to test it out. I really can't see any reason for a problem with the database connection remaining open for the life of the app. It is the recordsets that would concern me the most.
12-06-2001, 04:48 AM
I've got a little method that I think reduces network traffic.
Declare your connection objects globally (or global to a form, whichever is best). Then, whenever I want to get info from a database, I open a temporary (local) recordset as read-only, forward-only, immediately read all the values into a user-defined-type array (i.e. Type ... End Type), and then close the recordset. I then do all updates using SQL through a Command object (i.e. INSERT INTO and UPDATE ... WHERE).
The problem with this approach is that the underlying data could change between reading in the values and writing back the updates with the SQL. In my case, I managed to design some error handlers which took care of this. The advantage, however, is that a forward-only, read-only recordset is very fast in ADO, and you only have the recordset open for the amount of time it takes to read all of the data. Hope this might give you a new angle.
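A rough sketch of that read-into-a-UDT-then-update-with-SQL approach (the Type definition and the table/field names are assumptions for illustration, not from the post):

```vb
Private Type OrderInfo
    OrderID As Long
    Qty As Long
End Type

Public Sub LoadOrders(cnn As ADODB.Connection, orders() As OrderInfo)
    Dim rs As ADODB.Recordset, i As Long
    Set rs = New ADODB.Recordset
    ' Forward-only, read-only ("firehose") cursor: the cheapest cursor ADO offers
    rs.Open "SELECT OrderID, Qty FROM Orders", cnn, adOpenForwardOnly, adLockReadOnly
    Do Until rs.EOF
        ReDim Preserve orders(i)
        orders(i).OrderID = rs!OrderID
        orders(i).Qty = rs!Qty
        i = i + 1
        rs.MoveNext
    Loop
    rs.Close   ' the recordset stays open only long enough to copy the data out
End Sub

Public Sub SaveOrder(cnn As ADODB.Connection, o As OrderInfo)
    Dim cmd As ADODB.Command
    Set cmd = New ADODB.Command
    Set cmd.ActiveConnection = cnn
    ' Parameterized UPDATE instead of holding an updatable cursor open
    cmd.CommandText = "UPDATE Orders SET Qty = ? WHERE OrderID = ?"
    cmd.Parameters.Append cmd.CreateParameter("Qty", adInteger, adParamInput, , o.Qty)
    cmd.Parameters.Append cmd.CreateParameter("ID", adInteger, adParamInput, , o.OrderID)
    cmd.Execute
End Sub
```

The error-handling the poster mentions (detecting that a row changed underneath you) would sit around SaveOrder, e.g. by adding the original values to the WHERE clause and checking the affected-row count.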
12-06-2001, 07:27 AM
The method rcbasie suggests is very good in general for dealing with data
on a different database server. I use a variation on it that keeps me from
having to stick the values into arrays, the disconnected recordset. It is
done by using the adLockBatchOptimistic LockType and then setting the
ActiveConnection property of the Recordset to Nothing once the recordset
has been returned. For updates, I prefer passing parameters to stored procs
rather than generating the SQL directly.
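For reference, the disconnected-recordset variant described above looks roughly like this (the connection object and SQL are placeholders):

```vb
Dim rs As ADODB.Recordset
Set rs = New ADODB.Recordset
rs.CursorLocation = adUseClient       ' disconnecting requires a client-side cursor
rs.Open "SELECT * FROM Customers", cnn, adOpenStatic, adLockBatchOptimistic
Set rs.ActiveConnection = Nothing     ' disconnect: the data now lives entirely on the client
' ... browse and edit rs with no server session held open ...
Set rs.ActiveConnection = cnn         ' reconnect when ready to save
rs.UpdateBatch                        ' push all pending changes back in one batch
```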
12-06-2001, 09:22 AM
The problem with disconnected recordsets is that they are client side only. Ever tried to pull a half million records or so client side? Hope your users have a LOT of ram on all their stations. (Not putting down disconnected recordsets, they're wonderful, just pointing out a drawback).
12-06-2001, 09:27 AM
I agree, and would never use them for that many records. I have been on
record here in the past as saying I can't think of a good reason to ever pull
that many records at a time into a recordset. I deal with one main record
(and its associated detail records, if any) at a time. View, insert, update,
or delete, then go get the next record.
12-06-2001, 09:34 AM
If only life were that simple. Unfortunately, many applications don't work 'one record at a time'.