Grouping Questions

1

In many places here on StackOverflow, this question has been asked and answered. However, I have found that most of those answers, while technically correct, leave out some specific details that not only explain w...
Erin asked 20/1, 2020 at 21:16

3

I am trying to calculate a new column which contains maximum values for each of several groups. I'm coming from a Stata background so I know the Stata code would be something like this: by group, ...
Premarital asked 25/2, 2016 at 23:16
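The Stata idiom `by group: egen groupmax = max(value)` maps onto pandas' `transform`, which broadcasts a per-group aggregate back to every row. A minimal sketch with hypothetical data (the question's actual column names aren't shown):

```python
import pandas as pd

# Hypothetical data; the question's real columns are truncated.
df = pd.DataFrame({
    "group": ["a", "a", "b", "b", "b"],
    "value": [1, 5, 2, 9, 4],
})

# transform("max") returns a value for every row, not one per group,
# mirroring Stata's `by group: egen groupmax = max(value)`.
df["group_max"] = df.groupby("group")["value"].transform("max")
```

Unlike `agg`, `transform` keeps the original row count, so the result can be assigned straight back as a new column.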

5

Solved

As is typical for grids in a "dashboard" application that monitors dynamically changing data, my (Telerik) Kendo UI grid is refreshed with fresh data periodically, every 60 seconds: grid.dataSourc...
Amorphism asked 19/12, 2012 at 13:17

3

Solved

I'm working with 3D pointcloud of Lidar. The points are given by numpy array that looks like this: points = np.array([[61651921, 416326074, 39805], [61605255, 416360555, 41124], [61664810, 4163137...
Olsewski asked 8/12, 2019 at 21:2
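The grouping key is cut off in the excerpt, but one common vectorized pattern for point clouds is aggregating per integer label without a Python loop, using `np.add.at` and `np.bincount`. A sketch assuming each point already carries a label (the labels below are made up):

```python
import numpy as np

points = np.array([[61651921, 416326074, 39805],
                   [61605255, 416360555, 41124],
                   [61664810, 416313743, 39900]])

# Hypothetical group labels -- the question's actual key is truncated.
labels = np.array([0, 1, 0])

# Vectorized per-group mean: scatter-add each row into its group's slot,
# then divide by the group sizes.
uniq, inv = np.unique(labels, return_inverse=True)
counts = np.bincount(inv)
sums = np.zeros((len(uniq), points.shape[1]), dtype=float)
np.add.at(sums, inv, points)
means = sums / counts[:, None]
```

This stays in NumPy throughout, which matters for Lidar-sized arrays where a per-group Python loop would dominate the runtime.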

2

Solved

I need to programmatically apply different functions to different columns and group by, using data.table. If the columns and functions were known, I would do like this: library(data.table) DT = ...
Abstractionist asked 26/11, 2019 at 21:39
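The question is about R's data.table, but the same "columns and functions arrive as data, not as code" problem has a direct pandas analogue: build the aggregation spec as a dict and hand it to `agg`. A sketch with made-up data:

```python
import pandas as pd

# Illustrative pandas analogue of the R/data.table question.
DT = pd.DataFrame({"g": ["a", "a", "b"], "x": [1, 2, 3], "y": [4.0, 5.0, 6.0]})

cols = ["x", "y"]        # column names known only at runtime
funs = ["sum", "mean"]   # one function name per column

# Build the spec programmatically instead of hard-coding it.
spec = dict(zip(cols, funs))
res = DT.groupby("g").agg(spec)
```

Because the spec is an ordinary dict, it can be assembled from config files or user input before the groupby ever runs.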

1

Solved

I have a dataframe of detection events with columns providing an individual's Tag ID (Tag), Detection Start Time (StartDateTime_UTC), Detection End Time (EndDateTime_UTC), and location. I'd like t...
Lupelupee asked 30/10, 2019 at 15:14

3

Solved

SELECT Count(1) AS total, 'hello' AS filter, field1 AS field1, Count(DISTINCT field2) AS total_field2 FROM table WHERE field = true AND status = 'ok' GROUP BY field1 Doubts how to make a m...
Himalayas asked 12/10, 2019 at 18:41
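The query in the excerpt can be reproduced against an in-memory SQLite table to see what the `GROUP BY` produces; the rows below are invented, and `table` is quoted because it is a reserved word:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute('CREATE TABLE "table" (field BOOLEAN, status TEXT, field1 TEXT, field2 TEXT)')
conn.executemany('INSERT INTO "table" VALUES (?, ?, ?, ?)', [
    (True, "ok", "x", "p"),
    (True, "ok", "x", "q"),
    (True, "ok", "x", "p"),
    (True, "ok", "y", "p"),
    (True, "bad", "y", "q"),
])

# The excerpt's query: one row per field1 value, with a total row count
# and a distinct count of field2 within each group. SQLite stores the
# booleans as 1/0, hence `field = 1`.
rows = conn.execute('''
    SELECT COUNT(1) AS total,
           'hello' AS filter,
           field1,
           COUNT(DISTINCT field2) AS total_field2
    FROM "table"
    WHERE field = 1 AND status = 'ok'
    GROUP BY field1
''').fetchall()
```

`COUNT(1)` counts rows in the group while `COUNT(DISTINCT field2)` collapses duplicates first, which is why the two totals can differ for the same group.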

2

Solved

I have been having issues with what seems to be a simple thing to do: grouped boxplots with a continuous x axis. Here is some minimal data: df <- cbind(expand.grid(x=1:10, rep=1:20, fill=...
Rejoinder asked 14/7, 2017 at 10:43

4

Solved

I want to learn how to use the Java 8 syntax with streams and got a bit stuck. It's easy enough to groupingBy when you have one key for every value. But what if I have a List of keys for every val...
Feleciafeledy asked 2/5, 2014 at 7:34
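The question is about Java 8 streams, but the underlying inversion (each value owns a list of keys; build a map from key to the values that mention it) is easy to see in a short Python analogue with hypothetical data:

```python
from collections import defaultdict

# Illustrative analogue of the Java-streams question: each value has a
# LIST of keys, and we want key -> list of owning values.
values = {"v1": ["k1", "k2"], "v2": ["k2"]}

index = defaultdict(list)
for value, keys in values.items():
    for key in keys:          # flatten the key list before grouping
        index[key].append(value)
```

The essential step is the inner loop that flattens each value's key list; in Java streams the equivalent move is a `flatMap` over (key, value) pairs before collecting with `groupingBy`.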

2

Solved

Suppose I have the following DataFrame: df = pd.DataFrame({'Event': ['A', 'B', 'A', 'A', 'B', 'C', 'B', 'B', 'A', 'C'], 'Date': ['2019-01-01', '2019-02-01', '2019-03-01', '2019-03-01', '2019-02-...
Stentor asked 24/9, 2019 at 7:16
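The goal is truncated, but a common starting point for data shaped like this is counting occurrences per event per date with a two-key groupby. A sketch (dates beyond the excerpt are made up):

```python
import pandas as pd

df = pd.DataFrame({
    "Event": ["A", "B", "A", "A", "B", "C", "B", "B", "A", "C"],
    # Dates past the excerpt's truncation point are invented here.
    "Date": ["2019-01-01", "2019-02-01", "2019-03-01", "2019-03-01", "2019-02-01",
             "2019-03-01", "2019-02-01", "2019-03-01", "2019-01-01", "2019-01-01"],
})
df["Date"] = pd.to_datetime(df["Date"])

# One count per (Event, Date) combination.
counts = df.groupby(["Event", "Date"]).size()
```

Converting `Date` to a real datetime first means the same groupby can later be swapped to `pd.Grouper(key="Date", freq="MS")` for monthly buckets.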

3

Solved

I have groupings of values in the data and within each group, I would like to check if a value within the group is below 8. If this condition is met, the entire group is removed from the data set. ...
Bilocular asked 9/1, 2016 at 7:9
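Dropping a whole group because one member fails a test is exactly what `GroupBy.filter` does; the predicate sees each group's sub-frame and returns a single boolean. A pandas sketch with made-up data (the question's own language isn't shown in the excerpt):

```python
import pandas as pd

# Hypothetical data: group "b" contains a 7, so the entire group goes.
df = pd.DataFrame({
    "group": ["a", "a", "b", "b", "c", "c"],
    "value": [10, 9, 12, 7, 8, 15],
})

# Keep only groups where EVERY value is at least 8.
kept = df.groupby("group").filter(lambda g: (g["value"] >= 8).all())
```

Note that `filter` removes or keeps groups wholesale; it never drops individual rows within a surviving group.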

3

Is there a way to add a column groupings? For example: Unit 1 | Unit 2 | Pre Mid Post | Pre Mid Post | --- --- ---- | --- --- ---- | 2 4 5 | 3 4 4 | 1 2 4 | 3 4 5 | Basically, I need a heade...
Nicolas asked 2/9, 2011 at 15:21
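The tool behind the question isn't stated, but the spanning "Unit 1 / Unit 2" header maps naturally onto a two-level column index. A pandas sketch rebuilding the excerpt's table:

```python
import pandas as pd

# Two header rows: a group label ("Unit 1", "Unit 2") spanning three
# sub-columns each, built as a MultiIndex.
cols = pd.MultiIndex.from_product([["Unit 1", "Unit 2"], ["Pre", "Mid", "Post"]])
df = pd.DataFrame([[2, 4, 5, 3, 4, 4],
                   [1, 2, 4, 3, 4, 5]], columns=cols)

# Selecting by the group label returns all of that unit's sub-columns.
unit1 = df["Unit 1"]
```

Printing `df` shows the grouped header the question asks for, with the unit labels rendered once above their sub-columns.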

5

Solved

I have an unsorted list of integer tuples such as: a = [(1, 1), (3, 1), (4, 5), (8, 8), (4, 4), (8, 9), (2, 1)] I am trying to find a way to group up the "recursively adjacent" tuples. ...
Aixlachapelle asked 22/7, 2019 at 12:36
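"Recursively adjacent" grouping is a connected-components problem. Assuming adjacency means each coordinate differs by at most 1 (my reading of the truncated excerpt), a small union-find merges the chains:

```python
from itertools import combinations

a = [(1, 1), (3, 1), (4, 5), (8, 8), (4, 4), (8, 9), (2, 1)]

# Assumption: tuples are adjacent when both coordinates differ by <= 1;
# "recursively adjacent" = connected through a chain of such pairs.
parent = {t: t for t in a}

def find(t):
    while parent[t] != t:
        parent[t] = parent[parent[t]]  # path halving keeps trees shallow
        t = parent[t]
    return t

for p, q in combinations(a, 2):
    if abs(p[0] - q[0]) <= 1 and abs(p[1] - q[1]) <= 1:
        parent[find(p)] = find(q)      # union the two chains

groups = {}
for t in a:
    groups.setdefault(find(t), []).append(t)
components = list(groups.values())
```

With the sample list this yields three components: the (1,1)-(2,1)-(3,1) chain, the (4,4)/(4,5) pair, and the (8,8)/(8,9) pair.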

4

Solved

I have a big data frame with a group-name column, grouped with dplyr, so multiple rows share the same group name. To reduce the data, I would like to extract every nth element start...
Monopetalous asked 16/7, 2019 at 23:8
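The question is about dplyr; an analogous pandas trick is numbering rows within each group with `cumcount` and keeping those whose number is a multiple of n, which always starts from each group's first row. A sketch with made-up data:

```python
import pandas as pd

# Hypothetical data standing in for the dplyr-grouped frame.
df = pd.DataFrame({
    "group": ["a"] * 5 + ["b"] * 4,
    "value": range(9),
})

n = 2
# cumcount() numbers rows 0,1,2,... within each group, so this keeps
# every nth row per group, starting with each group's first row.
thinned = df[df.groupby("group").cumcount() % n == 0]
```

Because the mask is computed per group, each group restarts its own count; a plain `df.iloc[::n]` would not.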

1

Solved

I've asked a question before about enhancing some code, here. @Holger gave me the right answer, saying: Whenever you find yourself using the reducing collector with groupingBy, you ...
Efferent asked 15/7, 2019 at 14:27

2

Solved

I have a set of JSON array and I want the result to be grouped by the "Id" column. I will not use underscore.js for this, as this can't be used in our project. The only option is to do it with jQue...
Phenomenalism asked 17/6, 2015 at 13:52

2

So it is pretty straightforward. I need a way to group cells together, like a <div> or a <span>, but neither of them worked. <tbody> seemed like a good solution but it only works for...
Selfpronouncing asked 23/3, 2012 at 22:42

2

Solved

I am struggling to calculate the percent difference between the annual net sales for a company, taking NAs into account. Here's a sample of the data: dt <- data.table(lpermno = c(10065,...
Shemikashemite asked 23/6, 2019 at 12:51
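The question is about R/data.table; the analogous pandas move is a per-company `pct_change`, where the NA handling is the part that needs care. A sketch with invented figures:

```python
import pandas as pd

# Illustrative pandas analogue of the R question, with made-up sales.
dt = pd.DataFrame({
    "lpermno": [10065, 10065, 10065, 20001, 20001],
    "year":    [2016, 2017, 2018, 2017, 2018],
    "sales":   [100.0, None, 120.0, 50.0, 60.0],
})

# Percent difference vs. the previous year's sales within each company.
# The first row of every company is NaN (nothing to compare against);
# how the rows around the internal NA come out depends on whether NAs
# are forward-filled first, so decide that explicitly in real code.
dt["pct"] = dt.groupby("lpermno")["sales"].pct_change()
```

Keeping the groupby key means a company's first year never compares against the previous company's last year, which a bare `pct_change` on the whole column would silently do.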

3

Solved

I can't find an exact answer to this problem, so I hope I'm not duplicating a question. I have a dataframe as follows groupid col1 col2 col3 col4 1 0 n NA 2 1 NA NA 2 2 What I'm trying to co...
Reede asked 19/7, 2017 at 21:51
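The goal is truncated, but a common ask for data shaped like this is collapsing each groupid to a single row, taking the first non-NA per column, which is exactly what `GroupBy.first` does. A sketch using the two rows visible in the excerpt:

```python
import pandas as pd
import numpy as np

# The two groupid-1 rows from the excerpt, NAs as np.nan.
df = pd.DataFrame({
    "groupid": [1, 1],
    "col1": [0, np.nan],
    "col2": ["n", np.nan],
    "col3": [np.nan, 2.0],
    "col4": [2, 2],
})

# GroupBy.first() returns the first NON-NA value per column per group,
# so the two partial rows merge into one complete row.
collapsed = df.groupby("groupid").first()
```

Note that `first()` skips NAs per column independently; it is not the same as taking the group's first row.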

7

Solved

I have this code: from itertools import groupby from itertools import combinations teams = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10] combo = list(combinations(teams, 2)) The output is a list of 45 tuple...
Bulbous asked 3/6, 2019 at 13:41
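The question's goal is cut off, but a common follow-up to generating the 45 pairings is grouping them into 9 rounds in which every team plays exactly once, which the classic circle method produces. A sketch:

```python
from itertools import combinations

teams = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
combo = list(combinations(teams, 2))  # all 45 pairings

# Circle method: fix one team, rotate the rest by one each round.
# For 10 teams this gives 9 rounds of 5 pairings covering all 45
# combinations exactly once.
def round_robin(ts):
    ts = list(ts)
    fixed, rest = ts[0], ts[1:]
    rounds = []
    for _ in range(len(ts) - 1):
        order = [fixed] + rest
        half = len(ts) // 2
        rounds.append([(order[i], order[-1 - i]) for i in range(half)])
        rest = rest[-1:] + rest[:-1]   # rotate
    return rounds

rounds = round_robin(teams)
```

Each round pairs the first element of the circle with the last, the second with the second-to-last, and so on, so no team appears twice in a round.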

1

Solved

The Issue Given the following network of nodes and edges, I would like to derive all possible groupings of nodes where all nodes within a group are connected to all other nodes within that group ...
Consulate asked 29/4, 2019 at 20:25
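Groups in which every node connects to every other node are cliques, and the classic enumerator for the maximal ones is Bron-Kerbosch. A compact sketch (the edge list is made up, since the question's network isn't shown):

```python
# Hypothetical graph: nodes 1-2-3 form a triangle, 4 hangs off 3.
edges = [(1, 2), (1, 3), (2, 3), (3, 4)]

adj = {}
for u, v in edges:
    adj.setdefault(u, set()).add(v)
    adj.setdefault(v, set()).add(u)

# Bron-Kerbosch: r = current clique, p = candidates that extend it,
# x = already-processed nodes (prevents reporting non-maximal cliques).
def bron_kerbosch(r, p, x, out):
    if not p and not x:
        out.append(sorted(r))  # r is a maximal clique
        return
    for v in list(p):
        bron_kerbosch(r | {v}, p & adj[v], x & adj[v], out)
        p.remove(v)
        x.add(v)

cliques = []
bron_kerbosch(set(), set(adj), set(), cliques)
```

For this graph the maximal cliques are {1, 2, 3} and {3, 4}; note that node 3 legitimately appears in both, since maximal cliques may overlap.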

5

Solved

What is the "rails way" to efficiently grab all rows of a parent table along with a count of the number of children each row has? I don't want to use counter_cache as I want to run these ...
Stanwood asked 28/4, 2009 at 3:48

3

Solved

I've got a graph that I want graphviz to layout and visualize for me. The graph has 122 edges and 123 nodes. The edges are of 4 different kinds and I want them to be visually distinguishable. Howev...
Facia asked 27/6, 2010 at 20:50

6

Solved

I'm looking for an efficient way to identify spells/runs in a time series. In the image below, the first three columns is what I have, the fourth column, spell is what I'm trying to compute. I've t...
Purveyor asked 1/4, 2019 at 20:44
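The standard pandas idiom for spell/run ids is comparing each value with its shifted neighbor and taking a cumulative sum of the change points. A sketch with hypothetical values:

```python
import pandas as pd

# A spell gets a new id whenever the value changes from the previous row.
s = pd.Series(["a", "a", "b", "b", "b", "a"])
spell = (s != s.shift()).cumsum()
```

If the series carries one time series per person, wrap the same expression in a `groupby` so spells never run across person boundaries.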

3

Solved

I have a list of Items where each Item can belong to one or more category. For a limited set of categories(string) I want to create a map with category as key and list of Items as value. Assume m...
Instability asked 27/3, 2019 at 11:40

© 2022 - 2024 — McMap. All rights reserved.