gaps-and-islands Questions

1

Solved

I have a UserSession table in PostgreSQL 9.6 that stores users' login and logout times. I want to calculate the maximum number of concurrent sessions, which are only considered to be concurrent if ...
Fishwife asked 17/12, 2019 at 14:28
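The standard approach to peak concurrency is a +1/−1 event sweep: emit +1 at each login and −1 at each logout, sort the events, and take the maximum of the running sum. A minimal sketch, using SQLite as a stand-in for PostgreSQL; the `UserSession` name comes from the question, but the column names and sample data are assumptions (window functions require SQLite ≥ 3.25):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE UserSession (user_id INTEGER, login_time TEXT, logout_time TEXT);
INSERT INTO UserSession VALUES
  (1, '2019-12-17 10:00', '2019-12-17 11:00'),
  (2, '2019-12-17 10:30', '2019-12-17 10:45'),
  (3, '2019-12-17 10:40', '2019-12-17 12:00'),
  (4, '2019-12-17 11:30', '2019-12-17 11:45');
""")

# Turn each session into a +1 (login) and a -1 (logout) event, then take the
# maximum of the running sum ordered by event time. Ordering by delta as a
# tie-breaker makes a logout at the same instant as a login count as NOT
# concurrent.
peak = conn.execute("""
WITH events AS (
  SELECT login_time  AS ts, +1 AS delta FROM UserSession
  UNION ALL
  SELECT logout_time AS ts, -1 AS delta FROM UserSession
)
SELECT MAX(running) FROM (
  SELECT SUM(delta) OVER (ORDER BY ts, delta) AS running FROM events
)
""").fetchone()[0]

print(peak)  # 3 sessions overlap between 10:40 and 10:45
```

The same query works unchanged in PostgreSQL; only the connection boilerplate differs.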

1

Solved

I have a query like this in PostgreSQL: select count(id_student) students, date_beginning_course from data.sessions_courses left join my_schema.students on id_session_course=id_sesion where course...
Equipment asked 11/9, 2019 at 23:42

1

Solved

I'm querying a snapshot of customer data that contains the snapshot date, the customer ID and the 'value' of that customer on that day. I use the LAG function to return the previous day's value to k...
Awe asked 10/7, 2019 at 13:38

4

Solved

My scenario started off similar to an islands-and-gaps problem, where I needed to find consecutive days of work. My current SQL query answers "ProductA was produced at LocationA from DateA through Da...
Dignitary asked 2/5, 2019 at 15:41
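Consecutive-day islands are usually found with the classic date-minus-row-number trick: within each partition, subtracting the row number from the date yields a constant for every unbroken run of days. A sketch under assumed table and column names (SQLite's `julianday` plays the role of date arithmetic):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE production (product TEXT, location TEXT, work_date TEXT);
INSERT INTO production VALUES
  ('ProductA', 'LocationA', '2019-05-01'),
  ('ProductA', 'LocationA', '2019-05-02'),
  ('ProductA', 'LocationA', '2019-05-03'),
  ('ProductA', 'LocationA', '2019-05-06'),
  ('ProductA', 'LocationA', '2019-05-07');
""")

# date - row_number is constant within each run of consecutive days, so it
# serves as a grouping key; MIN/MAX per group give the island's endpoints.
rows = conn.execute("""
WITH numbered AS (
  SELECT product, location, work_date,
         julianday(work_date)
           - ROW_NUMBER() OVER (PARTITION BY product, location
                                ORDER BY work_date) AS grp
  FROM production
)
SELECT product, location,
       MIN(work_date) AS from_date, MAX(work_date) AS thru_date
FROM numbered
GROUP BY product, location, grp
ORDER BY from_date
""").fetchall()

print(rows)  # two islands: 05-01..05-03 and 05-06..05-07
```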

2

Solved

Suppose I have this table from the result of a select * from journeys: timestamp | inJourney (1 = true and 0 = false) -------------------------------------------------- time1 | 1 time2 | 1 time3 |...
Otherworld asked 12/4, 2019 at 14:42

3

Solved

I'm using SQL Server 2008 R2. I have a table called EmployeeHistory with the following structure and sample data: EmployeeID Date DepartmentID SupervisorID 10001 20130101 001 10009 10001 20130909 00...
Conservatism asked 12/11, 2013 at 5:10
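A common way to collapse a history table into per-department stints is to flag each row where the department differs from the previous row (via LAG), take a running sum of the flags as a group key, and aggregate. A sketch in SQLite with hypothetical sample rows (the question's data is truncated, so the values below are assumptions):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE EmployeeHistory (EmployeeID TEXT, Date TEXT, DepartmentID TEXT);
INSERT INTO EmployeeHistory VALUES
  ('10001', '20130101', '001'),
  ('10001', '20130201', '001'),
  ('10001', '20130909', '002'),
  ('10001', '20131001', '002'),
  ('10001', '20131201', '001');
""")

# Flag department changes, then a running sum of flags numbers each stint;
# MIN/MAX of Date per stint give its start and end.
stints = conn.execute("""
WITH marked AS (
  SELECT EmployeeID, Date, DepartmentID,
         CASE WHEN DepartmentID = LAG(DepartmentID) OVER (
                     PARTITION BY EmployeeID ORDER BY Date)
              THEN 0 ELSE 1 END AS changed
  FROM EmployeeHistory
),
grouped AS (
  SELECT EmployeeID, Date, DepartmentID,
         SUM(changed) OVER (PARTITION BY EmployeeID ORDER BY Date) AS grp
  FROM marked
)
SELECT EmployeeID, DepartmentID,
       MIN(Date) AS StartDate, MAX(Date) AS EndDate
FROM grouped
GROUP BY EmployeeID, grp
ORDER BY StartDate
""").fetchall()

print(stints)
```

Note that SQL Server 2008 R2 predates LAG; there the same grouping key is usually built with ROW_NUMBER differences or a self-join instead.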

2

Solved

Consider this list of dates as timestamptz: I grouped the dates by hand using colors: every group is separated from the next by a gap of at least 2 minutes. I'm trying to measure how much a giv...

4

Solved

Given the following table: CREATE TABLE channel1m ( ts TIMESTAMP WITHOUT TIME ZONE NOT NULL, itemId BIGINT, value BIGINT ) in which a row may be inserted each minute, per itemId, as follows: ...
Unity asked 5/12, 2012 at 17:2

1

Solved

I have been going through Stack Overflow trying to work this out over the last week and I still can't find a viable solution, so I was wondering if anyone could offer me some help/advice? Explan...
Barbershop asked 19/2, 2019 at 16:37

2

Solved

Right now I just have an aggregate of how many days a user has worked. I'm trying to change this query to return the longest run of consecutive days worked, where u12345 would be 4 and u1 would be 2. Is this possible ...
Kwang asked 16/1, 2019 at 21:44

1

Solved

I have this table and need to find gaps between intervals. Records may be overlapping. user | start_time | end_time user1|2018-09-26T02:16:52.023453|2018-09-26T03:12:04.404477 user1|2018-09-25T22:...
Paule asked 1/10, 2018 at 7:45
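With overlapping intervals, comparing each start only to the previous row's end is not enough; the usual fix is to compare it to the running maximum of all earlier end times. A sketch in SQLite with simplified, assumed data:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sessions (user TEXT, start_time TEXT, end_time TEXT);
INSERT INTO sessions VALUES
  ('user1', '2018-09-25 22:00:00', '2018-09-26 01:00:00'),
  ('user1', '2018-09-26 00:30:00', '2018-09-26 02:00:00'),
  ('user1', '2018-09-26 03:00:00', '2018-09-26 04:00:00');
""")

# A gap exists where an interval starts after the running maximum of all
# preceding end times; overlapping intervals never trigger the condition.
gaps = conn.execute("""
WITH ordered AS (
  SELECT user, start_time,
         MAX(end_time) OVER (
           PARTITION BY user ORDER BY start_time
           ROWS BETWEEN UNBOUNDED PRECEDING AND 1 PRECEDING
         ) AS prev_max_end
  FROM sessions
)
SELECT user, prev_max_end AS gap_start, start_time AS gap_end
FROM ordered
WHERE prev_max_end IS NOT NULL AND start_time > prev_max_end
""").fetchall()

print(gaps)  # one gap: 02:00 -> 03:00; the first two rows overlap
```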

4

Solved

How can I check if a range is completely covered by a set of ranges. In the following example: WITH ranges(id, a, b) AS ( SELECT 1, 0, 40 UNION SELECT 2, 40, 60 UNION SELECT 3, 80, 100 UNION S...
Moralist asked 18/9, 2018 at 10:0

3

I have a table with measurements. A measurement is taken every minute. I need to select only rows having the same sample_value more than once consecutively for the same device_id. Here are initial da...
Ioneionesco asked 22/8, 2018 at 10:14

3

Solved

How do I get the following result highlighted in yellow? Essentially I want a calculated field which increments by 1 when VeganOption = 1 and is zero when VeganOption = 0. I have tried using the ...
Monophyletic asked 21/8, 2018 at 17:54

3

Solved

I have a problem where I want to partition over a sorted table. Is there a way I can do that? I am using SQL Server 2016. Input Table: |---------|-----------------|-----------|------------| | pr...
Knob asked 28/6, 2018 at 8:23

3

Solved

I need to calculate the value of some column X based on some other columns of the current record and the value of X for the previous record (using some partition and order). Basically I need to impleme...
Erst asked 17/12, 2015 at 16:1

5

Solved

With the following data create table #ph (product int, [date] date, price int) insert into #ph select 1, '20120101', 1 insert into #ph select 1, '20120102', 1 insert into #ph select 1, '20120103',...
Meow asked 11/4, 2012 at 16:26

2

My goal is to take a set of data that is ordered by id and return a resultset that indicates the number of consecutive rows where the val column is identical. E.g. given this data: | id | val | | ...
Canica asked 14/6, 2015 at 13:45
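Run lengths of identical consecutive values are another textbook islands problem, solved with the difference of two ROW_NUMBERs: a global one and one partitioned by the value. The difference is constant within each run. A sketch with assumed data:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE t (id INTEGER PRIMARY KEY, val TEXT);
INSERT INTO t VALUES (1,'a'),(2,'a'),(3,'b'),(4,'b'),(5,'b'),(6,'a');
""")

# The difference between the overall row number and the per-value row number
# is the same for every row of one consecutive run, so (val, grp) identifies
# a run and COUNT(*) is its length.
runs = conn.execute("""
WITH keyed AS (
  SELECT id, val,
         ROW_NUMBER() OVER (ORDER BY id)
           - ROW_NUMBER() OVER (PARTITION BY val ORDER BY id) AS grp
  FROM t
)
SELECT val, COUNT(*) AS run_length, MIN(id) AS first_id
FROM keyed
GROUP BY val, grp
ORDER BY first_id
""").fetchall()

print(runs)  # runs: 'a' x2, 'b' x3, 'a' x1
```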

4

Solved

The issue here is related to another question I had... I have millions of records, and the ID of each of those records is auto-incremented; unfortunately, sometimes the ID that is generated is som...
Typhogenic asked 9/12, 2011 at 17:58
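Finding the missing ranges in an auto-increment column is the "gaps" half of the problem: pair each id with the next one via LEAD and report wherever the next id skips ahead. A sketch with assumed data:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE records (id INTEGER PRIMARY KEY);
INSERT INTO records VALUES (1),(2),(3),(7),(8),(12);
""")

# Wherever the next id is more than id + 1 away, the ids in between are
# missing; report each such hole as an inclusive [gap_start, gap_end] range.
gaps = conn.execute("""
WITH nexts AS (
  SELECT id, LEAD(id) OVER (ORDER BY id) AS next_id FROM records
)
SELECT id + 1 AS gap_start, next_id - 1 AS gap_end
FROM nexts
WHERE next_id > id + 1
""").fetchall()

print(gaps)  # missing ids 4-6 and 9-11
```

On millions of rows this is a single index-ordered scan, which tends to beat a `NOT IN`/anti-join formulation.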

3

Solved

I have a table that contains categories, dates and rates. Each category can have different rates for different dates, one category can have only one rate at a given date. Id CatId Date Rate -----...
Interscholastic asked 5/9, 2012 at 6:59

4

Using Postgres 9.3, I'm trying to count the number of contiguous days of a certain weather type. If we assume we have a regular time series and weather report: date|weather "2016-02-01";"Sunny" "2...
Tomlinson asked 18/2, 2016 at 20:29

2

Solved

In a query, I want to fill all NULL values with the last known value. When it's in a table and not in a query, it's easy: if I define and fill my table as follows: CREATE TABLE test_fill_null...
Kentonkentucky asked 16/12, 2015 at 14:5
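This last-observation-carried-forward fill can be done without `LAG ... IGNORE NULLS` (which PostgreSQL and SQLite lack) via a counting trick: `COUNT(value)` ignores NULLs, so a running count stays constant across each NULL run and can serve as a partition key. A sketch with assumed column names and data:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE test_fill_null (ts INTEGER, value INTEGER);
INSERT INTO test_fill_null VALUES (1, 10), (2, NULL), (3, NULL), (4, 40), (5, NULL);
""")

filled = conn.execute("""
WITH grouped AS (
  SELECT ts, value,
         -- COUNT(value) ignores NULLs, so this running counter is constant
         -- over each run of NULLs and steps up at every non-NULL value
         COUNT(value) OVER (ORDER BY ts) AS grp
  FROM test_fill_null
)
-- within each grp partition the only non-NULL entry is the value to carry
SELECT ts, MAX(value) OVER (PARTITION BY grp) AS value
FROM grouped
ORDER BY ts
""").fetchall()

print(filled)  # NULLs at ts 2, 3, 5 are filled with 10, 10, 40
```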

1

Database queries, normally so simple, yet sometimes so difficult. (brain trainer) So I have products, stocks and rentStockOrders. These products can be rented for a set of days. The stocks also ha...
Pianist asked 2/12, 2015 at 22:1

4

Solved

In PostgreSQL (9.3) I have a table defined as: CREATE TABLE charts ( recid serial NOT NULL, groupid text NOT NULL, chart_number integer NOT NULL, "timestamp" timestamp without time zone NOT NUL...
Frivolous asked 20/9, 2015 at 21:14

2

Solved

CREATE TABLE entries ( id serial NOT NULL, title character varying, load_sequence integer ); and data INSERT INTO entries(title, load_sequence) VALUES ('A', 1); INSERT INTO entries(title, loa...
Bullard asked 2/9, 2015 at 9:47

© 2022 - 2024 — McMap. All rights reserved.