Discussion:
Limit Rows in DataTable
Brady Kelly
2008-04-29 09:12:46 UTC
I have an application under test, where I iteratively reset an export batch
and then run the export for different configurations. For major testing, I
wish to test each config over the full monty, 70k documents exported, but
for dev testing I like to keep the rows in each section down to 500. Last
night in the interests of not breaking code, I was popping 'TOP 500' into
each query, but I'm sure there is a better way. Any suggestions will be
appreciated. I'm using a SqlDataAdapter to fill my DataTable objects.
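
For illustration, here is a sketch of two ways to cap the rows without hand-editing each query. The table, column, and method names (ExportDocuments, DocumentId, LoadDocuments) are invented for the example, not taken from the real application:

// Sketch only: two ways to cap the rows loaded into a DataTable for dev testing.
using System.Data;
using System.Data.SqlClient;

public static class ExportTestHelpers
{
    // 1) Parameterize the TOP value instead of pasting 'TOP 500' into each
    //    query (SQL Server 2005 and later accept TOP (@n) with a parameter).
    public static DataTable LoadDocuments(string connectionString, int maxRows)
    {
        DataTable docs = new DataTable("ExportDocuments");
        using (SqlConnection conn = new SqlConnection(connectionString))
        using (SqlCommand cmd = new SqlCommand(
            "SELECT TOP (@max) * FROM ExportDocuments ORDER BY DocumentId", conn))
        {
            cmd.Parameters.AddWithValue("@max", maxRows);
            using (SqlDataAdapter adapter = new SqlDataAdapter(cmd))
            {
                adapter.Fill(docs);   // Fill opens and closes the connection itself
            }
        }
        return docs;
    }

    // 2) Leave the query untouched and use the Fill(startRecord, maxRecords)
    //    overload on the adapter to cap what lands in the DataSet.
    public static DataSet LoadDocumentsCapped(string connectionString, int maxRows)
    {
        DataSet ds = new DataSet();
        using (SqlConnection conn = new SqlConnection(connectionString))
        using (SqlDataAdapter adapter = new SqlDataAdapter(
            "SELECT * FROM ExportDocuments ORDER BY DocumentId", conn))
        {
            adapter.Fill(ds, 0, maxRows, "ExportDocuments");
        }
        return ds;
    }
}

The Fill(startRecord, maxRecords) overload is the least invasive change, but as far as I recall the rows beyond the cap are still streamed from the server and merely discarded, so for the 70k case the parameterized TOP is the cheaper of the two.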


Greg Robinson
2008-04-29 12:15:23 UTC
Be careful with this. We recently had a customer do an emailing with 22,000
rows in a DataTable inside a DataSet. The DataSet simply isn't designed for
that many records and it was SLOW. We broke it up into smaller DataSets of
about 500 records each.
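
One way to do that kind of chunking, as a rough sketch (the table and column names, Recipients and RecipientId, are made up for the example):

// Key-based chunking: load and process roughly batchSize rows at a time instead
// of pulling everything into one huge DataTable. Names are illustrative only.
using System.Data;
using System.Data.SqlClient;

public static class BatchLoader
{
    public static void ProcessInBatches(string connectionString, int batchSize)
    {
        int lastId = 0;
        while (true)
        {
            DataTable batch = new DataTable();
            using (SqlConnection conn = new SqlConnection(connectionString))
            using (SqlCommand cmd = new SqlCommand(
                "SELECT TOP (@n) * FROM Recipients WHERE RecipientId > @lastId " +
                "ORDER BY RecipientId", conn))
            {
                cmd.Parameters.AddWithValue("@n", batchSize);
                cmd.Parameters.AddWithValue("@lastId", lastId);
                using (SqlDataAdapter adapter = new SqlDataAdapter(cmd))
                {
                    adapter.Fill(batch);
                }
            }

            if (batch.Rows.Count == 0) break;   // nothing left to process

            foreach (DataRow row in batch.Rows)
            {
                // ... send the email for this row ...
                lastId = (int)row["RecipientId"];
            }
        }
    }
}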



Greg

Brady Kelly
2008-04-29 13:45:49 UTC
Post by Greg Robinson
Be careful with this. We recently had a customer do an emailing with 22,000
rows in a DataTable inside a DataSet. The DataSet simply isn't designed for
that many records and it was SLOW. We broke it up into smaller DataSets of
about 500 records each.
Greg
I'm only using high numbers for testing, e.g. I'm testing a whole two years'
worth of invoices, where the normal load is monthly. I'll normally be looking
at a few hundred rows for the foreseeable future. It shows a small lag loading
the data with up to 20k documents, but processes them lightning fast.

Marc Brooks
2008-04-29 17:00:38 UTC
Post by Greg Robinson
Be careful with this. We recently had a customer do an emailing with 22,000
rows in a DataTable inside a DataSet. The DataSet simply isn't designed for
that many records and it was SLOW. We broke it up into smaller DataSets of
about 500 records each.
I'll second this, and remind people that <singing> Anything you can
do with a DataSet, I can do with a DataReader faster </singing>

My usual pattern these days is to new up the DTO from the SqlDataReader and
pass it to a delegate, calling that guy inside the loop body in my DAL (which
keeps the reader's Dispose in one place and prevents forgotten-Dispose leaks).
Something like this in VB for variety :):

' HandleOne(Of T) is a delegate defined elsewhere that takes each loaded entry
' and returns False to stop reading; PrepareCommand is a helper (not shown) that
' wires the command to the connection, opening it if needed and setting
' mustCloseConnection when it does.

Public Interface ILoadFromReader
    Sub LoadFromReader(ByVal reader As SqlDataReader)
End Interface

Public Function ExecuteDataReader(Of T As {New, ILoadFromReader})( _
        ByVal connection As SqlConnection, _
        ByVal commandType As CommandType, _
        ByVal commandText As String, _
        ByVal handler As HandleOne(Of T), _
        ByVal ParamArray commandParameters() As SqlClient.SqlParameter) As Long

    If (connection Is Nothing) Then Throw New ArgumentNullException("connection")

    ' Create a command and prepare it for execution
    Dim cmd As SqlCommand = New SqlCommand
    Dim mustCloseConnection As Boolean = False

    Try
        PrepareCommand(cmd, connection, CType(Nothing, SqlTransaction), _
                       commandType, commandText, commandParameters, mustCloseConnection)

        Dim count As Long = 0
        Dim reader As SqlDataReader = Nothing
        Try
            reader = cmd.ExecuteReader()

            If reader.HasRows Then
                While reader.Read()
                    count = count + 1
                    Dim entry As T = New T()
                    entry.LoadFromReader(reader)
                    If Not handler(entry) Then Exit While
                End While
            End If
        Finally
            If Not (reader Is Nothing) Then reader.Dispose()
        End Try

        ' Detach the SqlParameters from the command object, so they can be used again
        cmd.Parameters.Clear()

        Return count
    Finally
        If (mustCloseConnection) Then connection.Close()
    End Try
End Function

--
"Your lack of planning DOES constitute an emergency on my part... so
PLAN BETTER! "

Marc C. Brooks
http://musingmarc.blogspot.com

Tony Bermudez
2008-04-30 00:41:20 UTC
Is there any way that you can post the C# equivalent code?

Thanks,
Tony
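
For reference, a rough C# equivalent of Marc's VB routine (a sketch only: HandleOne and PrepareCommand live elsewhere in his library and aren't shown in the thread, so minimal stand-ins are included here just to make the snippet self-contained):

using System;
using System.Data;
using System.Data.SqlClient;

public interface ILoadFromReader
{
    void LoadFromReader(SqlDataReader reader);
}

// Stand-in for the HandleOne(Of T) delegate: called once per loaded entry,
// return false to stop reading.
public delegate bool HandleOne<T>(T entry);

public static class DataHelper
{
    public static long ExecuteDataReader<T>(
        SqlConnection connection,
        CommandType commandType,
        string commandText,
        HandleOne<T> handler,
        params SqlParameter[] commandParameters)
        where T : ILoadFromReader, new()
    {
        if (connection == null) throw new ArgumentNullException("connection");

        // Create a command and prepare it for execution.
        SqlCommand cmd = new SqlCommand();
        bool mustCloseConnection = false;

        try
        {
            PrepareCommand(cmd, connection, null, commandType, commandText,
                           commandParameters, ref mustCloseConnection);

            long count = 0;
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    count++;
                    T entry = new T();
                    entry.LoadFromReader(reader);
                    if (!handler(entry)) break;
                }
            }

            // Detach the SqlParameters from the command so they can be reused.
            cmd.Parameters.Clear();
            return count;
        }
        finally
        {
            if (mustCloseConnection) connection.Close();
        }
    }

    // Simplified stand-in for Marc's PrepareCommand helper: wire the command to
    // the connection, opening it if it isn't open already.
    private static void PrepareCommand(SqlCommand cmd, SqlConnection connection,
        SqlTransaction transaction, CommandType commandType, string commandText,
        SqlParameter[] commandParameters, ref bool mustCloseConnection)
    {
        if (connection.State != ConnectionState.Open)
        {
            connection.Open();
            mustCloseConnection = true;
        }
        cmd.Connection = connection;
        cmd.CommandType = commandType;
        cmd.CommandText = commandText;
        if (transaction != null) cmd.Transaction = transaction;
        if (commandParameters != null) cmd.Parameters.AddRange(commandParameters);
    }
}

Because the handler can return false, the read stops early without ever materializing the full result set the way a DataTable fill would.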


===================================
This list is hosted by DevelopMentor. http://www.develop.com

View archives and manage your subscription(s) at http://discuss.develop.com