
[#1010458] DataReader with CommandBehavior.SchemaOnly can be slow


Date: 2008-08-18 09:48
Priority: 3
State: Open
Submitted by: Andrus Moor (kobruleht)
Assigned to: Nobody (None)
Npgsql Version: 2.0rc1
Category:
Group:
Resolution: None
Summary: DataReader with CommandBehavior.SchemaOnly can be slow

Detailed description
DataReader with CommandBehavior.SchemaOnly is likely to be slow because it still
forces PostgreSQL to prepare the data for sending to the client.

To reproduce, run the code below.

Observed:

1. The "select * from pg_tables" command is sent to the server.
2. The server is not informed that only the structure is required.

Adding limit 0 to the query automatically is not a solution, because the query may already contain a limit clause.

using System.Data;
using Npgsql;

class Program
{
    static void Main()
    {
        const string s = "SERVER=localhost;USER=postgres";
        const string query = @"select * from pg_tables";

        var con1 = new NpgsqlConnection(s);
        IDbCommand cmd = con1.CreateCommand();
        cmd.CommandText = query;
        con1.Open();

        // Only the column metadata is wanted, yet the full query is sent
        // to the server and executed as if rows had been requested.
        var dr = cmd.ExecuteReader(CommandBehavior.SchemaOnly);
        var n = dr.GetName(0);
    }
}

Followup

Date: 2008-12-18 13:15
Sender: Josh Cooley

Prepare does a similar thing. If you send a Parse and then a Describe message to the server, it will return just the row description. The last couple of releases have been about correctness, so this fix hasn't made it in yet.

The NpgsqlCommand.Prepare method already shows how to do a Parse and Describe, so it should just be a matter of adding similar code to ExecuteReader (or GetReader).
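
A note on the protocol sequence described above: the following is a minimal sketch written directly against the PostgreSQL frontend/backend protocol, not Npgsql internals. It assumes "stream" is a stream for a connection that has already completed startup and authentication (omitted here), and is only meant to show that a Parse followed by a Describe of the prepared statement makes the server return a RowDescription without executing the query.

using System;
using System.IO;
using System.Net;
using System.Text;

static class ParseDescribeSketch
{
    // Frame an extended-protocol message: 1-byte type, Int32 length
    // (which includes itself), then the body.
    static void WriteMessage(Stream stream, char type, byte[] body)
    {
        stream.WriteByte((byte)type);
        byte[] len = BitConverter.GetBytes(IPAddress.HostToNetworkOrder(body.Length + 4));
        stream.Write(len, 0, 4);
        stream.Write(body, 0, body.Length);
    }

    // Null-terminated UTF-8 string, as the protocol expects.
    static byte[] CString(string s)
    {
        byte[] b = Encoding.UTF8.GetBytes(s);
        byte[] z = new byte[b.Length + 1];
        Array.Copy(b, z, b.Length);
        return z;
    }

    public static void DescribeOnly(Stream stream, string query)
    {
        byte[] name = CString("");        // unnamed prepared statement

        // Parse: statement name, query text, Int16 count of parameter OIDs (0).
        var parse = new MemoryStream();
        byte[] sql = CString(query);
        parse.Write(name, 0, name.Length);
        parse.Write(sql, 0, sql.Length);
        parse.Write(new byte[] { 0, 0 }, 0, 2);
        WriteMessage(stream, 'P', parse.ToArray());

        // Describe the prepared statement ('S') by name.
        var describe = new MemoryStream();
        describe.WriteByte((byte)'S');
        describe.Write(name, 0, name.Length);
        WriteMessage(stream, 'D', describe.ToArray());

        // Sync: the server then replies with ParseComplete, ParameterDescription
        // and RowDescription (or NoData), but no DataRow messages, because
        // nothing was executed.
        WriteMessage(stream, 'S', new byte[0]);
        stream.Flush();
    }
}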
Date: 2008-12-18 08:34
Sender: Andrus Moor

"I'm pretty sure that the only safe way to do this is to
add limit 0 yourself "

There was discussion about it in forum. Josh wrote that he
can found solution for this.
So I'm sure thare is better way.
Date: 2008-12-18 00:27
Sender: Jon Hanna

You're correct that adding limit 0 could fail in some cases.

We used to wrap it with select * from ( [user code] ) limit 0; which is safe in a large number of cases, but would still fail in others. In particular, you should be able to run schema-only ExecuteReader() calls with queries that return more than one recordset and let the user examine each with NextResult(). IMO, always correct but sometimes slow beats usually fast but sometimes throwing an error, so I think it's better that we no longer do this.

I'm pretty sure that the only safe way to do this is to add limit 0 yourself :(
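
Until this is fixed in Npgsql itself, a minimal client-side workaround along the lines discussed above is to wrap the query in an outer select ... limit 0 before asking for the schema. As noted in the messages above, this is not safe for every query (batches returning more than one recordset, for example), so treat it as a sketch rather than a general solution; it reuses the connection string and query from the repro case.

using System;
using System.Data;
using Npgsql;

class SchemaOnlyWorkaround
{
    static void Main()
    {
        const string connString = "SERVER=localhost;USER=postgres";
        const string userQuery = "select * from pg_tables";

        // The outer wrapper keeps the server from returning any rows.
        string schemaQuery = "select * from (" + userQuery + ") as sub limit 0";

        using (var con = new NpgsqlConnection(connString))
        using (IDbCommand cmd = con.CreateCommand())
        {
            cmd.CommandText = schemaQuery;
            con.Open();
            using (IDataReader dr = cmd.ExecuteReader(CommandBehavior.SchemaOnly))
            {
                // Column metadata is available even though no rows were sent.
                Console.WriteLine(dr.GetName(0));
            }
        }
    }
}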

