Once again, this question was bumped to the front page :( Back then I made a comment like this:

It is unfortunate that this question from 6 years back has been bumped
to the front page :( The reason is that the question is starving for some
explanation and the design is screaming of flaws. Without knowing
those details, no answer would be good. 1) Never ever rely on the old
engine behavior to work around this error. It was a BUG and was corrected.
Relying on a bug is not the way to go. 2) To create a date value,
never ever do that by converting from a string. That is settings
dependent. Better to use the solid DATE() and DATETIME() functions, or
strict date/datetime literals.

That comment still stands, and there is more to say anyway.
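A quick sketch of what point 2 means in VFP (the CTOD() line is shown only as the anti-pattern to avoid):

```foxpro
* Settings-independent ways to build a date; both give June 1, 2012
* regardless of SET DATE / SET CENTURY:
sd = DATE(2012, 6, 1)   && the DATE() function
sd = {^2012-06-01}      && strict date literal (note the leading caret)

* Settings-dependent; avoid: the result flips with SET DATE MDY/DMY/...
sd = CTOD("06/01/2012")
```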
Let's add more about those flaws, and then a reply based on what I understand from the question.
In the question, the OP says: "
SELECT vn.type, SUM(vn.obal + IIF(v.date < sd, v.credit-v.debit, 0)) OpBal, ;
SUM(IIF(BETWEEN(v.date, sd, ed), v.credit-v.debit, 0)) CurBal ;
FROM v, vn WHERE v.name = vn.name GROUP BY vn.type ;
ORDER BY vn.type HAVING OpBal + CurBal != 0
works fine."
But of course, any seasoned developer using SQL would immediately see that it would not work fine; the results could only be correct coincidentally. Let's make it clear why it wouldn't really work. First, let's create some cursors describing the OP's data:
CREATE CURSOR vn ( date d, name c(25), type c(25), obal n(7) )
INSERT INTO vn (date, name, type, obal) VALUES ( DATE(2012,5,20), "abc", "bank", 100 )
INSERT INTO vn (date, name, type, obal) VALUES ( DATE(2012,5,20), "def", "bank", 200 )
INSERT INTO vn (date, name, type, obal) VALUES ( DATE(2012,5,20), "ghi", "bank", 300 )
INSERT INTO vn (date, name, type, obal) VALUES ( DATE(2012,5,20), "xyz", "ledger", 400 )
INSERT INTO vn (date, name, type, obal) VALUES ( DATE(2012,5,20), "pqr", "ledger", 500 )
CREATE CURSOR v ( date d, name c(25), desc c(50), debit n(7), credit n(7))
INSERT INTO V (date,name,desc,debit,credit) VALUES ( DATE(2012,6,1), "abc", "description 1", 50, 0 )
INSERT INTO V (date,name,desc,debit,credit) VALUES ( DATE(2012,6,2), "abc", "description 1", 60, 0 )
INSERT INTO V (date,name,desc,debit,credit) VALUES ( DATE(2012,6,3), "abc", "description 1", 70, 0 )
INSERT INTO V (date,name,desc,debit,credit) VALUES ( DATE(2012,6,1), "def", "description 1", 50, 0 )
INSERT INTO V (date,name,desc,debit,credit) VALUES ( DATE(2012,6,2), "def", "description 1", 60, 0 )
INSERT INTO V (date,name,desc,debit,credit) VALUES ( DATE(2012,6,3), "def", "description 1", 70, 0 )
INSERT INTO V (date,name,desc,debit,credit) VALUES ( DATE(2012,6,1), "ghi", "description 1", 50, 0 )
INSERT INTO V (date,name,desc,debit,credit) VALUES ( DATE(2012,6,2), "ghi", "description 1", 60, 0 )
INSERT INTO V (date,name,desc,debit,credit) VALUES ( DATE(2012,6,4), "xyz", "description 1", 50, 0 )
INSERT INTO V (date,name,desc,debit,credit) VALUES ( DATE(2012,6,5), "xyz", "description 1", 60, 0 )
INSERT INTO V (date,name,desc,debit,credit) VALUES ( DATE(2012,6,6), "pqr", "description 1", 50, 0 )
INSERT INTO V (date,name,desc,debit,credit) VALUES ( DATE(2012,6,7), "pqr", "description 1", 60, 0 )
INSERT INTO V (date,name,desc,debit,credit) VALUES ( DATE(2012,6,8), "pqr", "description 1", 70, 0 )
Let's run OP's query on this data:
LOCAL sd,ed
sd = date(2012,6,1) && start date of transactions
ed = DATE() && current date as the end date...
SELECT vn.type, SUM(vn.obal + IIF(v.date < sd, v.credit-v.debit, 0)) OpBal, ;
SUM(IIF(BETWEEN(v.date, sd, ed), v.credit-v.debit, 0)) CurBal ;
FROM v, vn WHERE v.name = vn.name GROUP BY vn.type ;
ORDER BY vn.type HAVING OpBal + CurBal != 0
We get a result like:
TYPE OPBAL CURBAL
------- ----- ------
bank 1500 -470
ledger 2300 -290
which is obviously incorrect. With a query like this, the more credit or debit rows you have, the larger your opening balance gets. Let's see why that is happening: remove the GROUP BY and the aggregation, and check what we are really summing up:
SELECT vn.type, vn.name, v.date, vn.obal, v.credit, v.debit ;
FROM v, vn ;
WHERE v.name = vn.name
Output:
TYPE NAME DATE OBAL CREDIT DEBIT
bank abc 06/01/2012 100 0 50
bank abc 06/02/2012 100 0 60
bank abc 06/03/2012 100 0 70
bank def 06/01/2012 200 0 50
bank def 06/02/2012 200 0 60
bank def 06/03/2012 200 0 70
bank ghi 06/01/2012 300 0 50
bank ghi 06/02/2012 300 0 60
ledger xyz 06/04/2012 400 0 50
ledger xyz 06/05/2012 400 0 60
ledger pqr 06/06/2012 500 0 50
ledger pqr 06/07/2012 500 0 60
ledger pqr 06/08/2012 500 0 70
You can see that, say for 'abc', the OBal of 100 is repeated 3 times because of the 3 entries in v. Summing makes it 300 when it is really just 100.
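The duplication can be demonstrated directly; a minimal sketch summing just OBal for 'abc' over the same join:

```foxpro
* OBal for 'abc' is 100 in vn, but the join repeats it once per matching v row
SELECT vn.name, SUM(vn.obal) AS obal_summed ;
    FROM v, vn ;
    WHERE v.name = vn.name AND vn.name = "abc" ;
    GROUP BY vn.name
* abc has 3 rows in v, so obal_summed comes out as 300, not 100
```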
When you are using an aggregate like SUM() or AVG(), you should do the aggregation first, without a join, and then do your join. You can still do the aggregation with a join, PROVIDED the join results in a 1-to-many relation. If the above resultset were:
TYPE NAME OBAL CREDIT DEBIT
bank abc 100 0 180
bank def 200 0 180
bank ghi 300 0 110
ledger xyz 400 0 110
ledger pqr 500 0 180
it would be OK to SUM() by type (the 1 side of the 1-to-many).
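For reference, a pre-aggregated resultset like the one above can be produced by grouping at the name level, so each vn row stays on the 1 side (a sketch):

```foxpro
* One row per name: v's rows are collapsed by SUM() before they can
* multiply vn.obal
SELECT vn.type, vn.name, vn.obal, ;
    SUM(v.credit) AS credit, SUM(v.debit) AS debit ;
    FROM vn ;
    JOIN v ON v.name = vn.name ;
    GROUP BY vn.type, vn.name, vn.obal ;
    ORDER BY vn.type, vn.name
```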
Having said that, and adding that VFP supports subqueries, let's write a solution:
Local sd,ed
sd = Date(2012,6,1) && start date of transactions
ed = Date() && current date as the end date...
&& Nvl() guards names that have no transactions in v: the LEFT Join
&& yields NULL for them, and NULL would otherwise swallow the whole Sum().
&& As in the OP's OpBal expression, the pre-period net change TotCd is
&& added to OBal, not subtracted.
Select vn.Type, ;
    Sum(vn.OBal + Nvl(tmp.TotCd, 0)) As OpBal, ;
    Sum(Nvl(tmp.Curbal, 0)) As Curbal ;
    FROM vn ;
    LEFT Join ;
        (Select v.Name, Sum(Iif(v.Date < sd, v.credit-v.debit, 0)) TotCd, ;
            Sum(Iif(Between(v.Date, sd, ed), v.credit-v.debit, 0)) Curbal ;
            FROM v ;
            GROUP By v.Name ) tmp On tmp.Name = vn.Name ;
    GROUP By vn.Type ;
    ORDER By vn.Type ;
    HAVING Sum(vn.OBal + Nvl(tmp.TotCd, 0) + Nvl(tmp.Curbal, 0)) != 0
we get what we want:
TYPE OPBAL CURBAL
------- ----- ------
bank 600 -470
ledger 900 -290