I'm not quite sure what you mean by "stopping agents" in this context. The only "agent" I can think of here would be an Enterprise Manager agent, but I doubt that's what you're talking about, and I don't see why you'd need a long-running query for that.
The simplest way to force a query to run for a long time is to have it sleep for a bit using the dbms_lock.sleep procedure. Something like
CREATE OR REPLACE FUNCTION make_me_slow( p_seconds IN NUMBER )
  RETURN NUMBER
IS
BEGIN
  -- Sleep for the requested number of seconds, then return a dummy
  -- value so the function can be called from a SQL SELECT list.
  dbms_lock.sleep( p_seconds );
  RETURN 1;
END;
/
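One thing to watch: EXECUTE on DBMS_LOCK is typically not granted to PUBLIC, so the schema that owns make_me_slow may need an explicit grant from a privileged user. The grant and the quick sanity check below are just a sketch (your_schema is a placeholder):

GRANT EXECUTE ON sys.dbms_lock TO your_schema;   -- run as a privileged user

SELECT make_me_slow( 2 ) FROM dual;   -- should return 1 after roughly 2 seconds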
You can then call make_me_slow in your query
SELECT st.*, make_me_slow( 0.01 )
FROM some_table st
That will call make_me_slow once for every row in some_table. Each call to make_me_slow will take at least 0.01 seconds. If some_table has, say, 10,000 rows, that would take at least 100 seconds. If it has 100,000 rows, that would take at least 1,000 seconds (16.67 minutes).
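If you want tighter control over the total runtime regardless of how big some_table actually is, you can cap the number of rows (and therefore the number of sleep calls) with ROWNUM. A sketch, assuming some_table has at least 10,000 rows:

SELECT make_me_slow( 0.01 )
FROM some_table
WHERE rownum <= 10000   -- 10,000 calls x 0.01 seconds = at least 100 seconds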
If you don't care about the results of the query, you can use the dual table to generate the rows so that you don't need a table with materialized rows. Something like
SELECT make_me_slow( 0.01 )
FROM dual
CONNECT BY level <= 20000
will generate 20,000 rows of data and take at least 200 seconds.
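The total time is roughly the sleep length multiplied by the number of generated rows, so you can trade one against the other. For example, the same 200-second floor with fewer, longer sleeps:

SELECT make_me_slow( 1 )
FROM dual
CONNECT BY level <= 200   -- 200 rows x 1 second = at least 200 seconds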
If you want a pure SQL query (which gives you less ability to control exactly how long it's going to run),
select count(*)
from million_row_table a
cross join million_row_table b
will have to work through a 1 million x 1 million = 1 trillion row cross product. That's likely to run long enough to blow out whatever TEMP tablespace you have defined.
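If you don't have a million-row table handy, you can generate the rows on the fly with the same CONNECT BY trick. This is only a sketch; with no real table involved the work is mostly CPU and memory rather than I/O, but it will still run for a very long time:

select count(*)
from (select level from dual connect by level <= 1000000) a
cross join (select level from dual connect by level <= 1000000) b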