Forum Discussion
The Diagonal Suite: Gentle thunking goes a long way!
I finished up copy/pasting your text to the workbook module!
I returned the reverse diagonals individually for summation (as used for accounts receivable or depreciation schedules). I hadn't realised GitHub can return binary files.
I had considered including Traverseλ in the signature of ByDiagλ but pulled back because I thought anti-diagonal aggregations would not be common (and it would be one parameter too many). There seems to be a dearth of good diagonally arranged examples online (unless I'm not searching with the right terms). I'll have a look at accounts receivable and depreciation schedules. Thank you!
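Since worked examples do seem thin on the ground, here is a rough, generic sketch of summing a matrix along its anti-diagonals, just to pin the idea down. It is not the ByDiagλ/Traverseλ code; the name AntiDiagSumsλ and the MAP/SEQUENCE broadcasting are simply choices made for this post:

AntiDiagSumsλ
= LAMBDA(A,
    LET(
        n, ROWS(A),
        m, COLUMNS(A),
        r, SEQUENCE(n),
        c, SEQUENCE(1, m),
        k, SEQUENCE(n + m - 1, 1, 2),
        MAP(k, LAMBDA(d, SUM(IF(r + c = d, A, 0))))
    )
  )

Each value of k labels one anti-diagonal (the cells whose row and column indices add to the same total), so the result is a single column with one sum per anti-diagonal, which is roughly the shape an ageing or depreciation schedule wants.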
- PeterBartholomew1, Nov 01, 2025, Silver Contributor
I think your use of thunks may well be closer to the original intent than mine. Although you define the code to generate multiple arrays, you only evaluate the ones selected by the use case. I think such 'code insertion' was central to the concept of thunks (any functional programmers out there might like to comment).
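To make that concrete, a worksheet thunk is just a zero-parameter LAMBDA, so nothing inside it is calculated until it is called with a trailing (). A minimal sketch, with the names and the branch test invented purely for illustration:

= LET(
      evensThunk,   LAMBDA(SEQUENCE(100000) * 2),
      squaresThunk, LAMBDA(SEQUENCE(100000) ^ 2),
      choice,       2,
      picked,       IF(choice = 1, evensThunk, squaresThunk),
      picked()
  )

Defining the two thunks costs next to nothing; only the one bound to 'picked' ever has its body expanded, which is what I mean by the code, rather than the result, being inserted.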
I have used thunks in a rather different manner. I actually use a LET variable to force the evaluation of the content before forming the thunk. My mindset is more one of "I may reference this data many times and wish to ensure that the processing is only performed once". I am, in effect, passing data by reference and avoiding the prospect of disappearing down the rabbit hole of recursive calculation as I go!
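In formula terms, the pattern I have in mind looks something like the sketch below; the SORT/MOD expression is just a stand-in for any expensive step, and the name dataThunk is invented for the example:

= LET(
      data,      SORT(MOD(SEQUENCE(100000) ^ 2, 9973)),
      dataThunk, LAMBDA(data),
      VSTACK(TAKE(dataThunk(), 5), SUM(dataThunk()))
  )

The LET variable 'data' pins the expensive step down so that it is evaluated once; the thunk merely wraps the finished array, and calling dataThunk() twice hands back the same stored result rather than repeating the sort.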
I find it very difficult to determine which strategy should perform better and suspect it is case-specific.
The examples of anti-diagonal (weighted) summations I quoted fit into a pattern of discrete convolutions. Ultimately, one can resort to Fast Fourier Transforms, but that is far removed from the benefits of thunking you set out to convey!
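To spell the connection out: the anti-diagonal sums of an outer product a * TRANSPOSE(b) are exactly the terms of the discrete convolution of the two columns. A throwaway sketch, with literal arrays used only to keep it self-contained:

= LET(
      a, {1;2;3},
      b, {1;1;1;1},
      n, ROWS(a),
      m, ROWS(b),
      outer, a * TRANSPOSE(b),
      k, SEQUENCE(n + m - 1, 1, 2),
      MAP(k, LAMBDA(d, SUM(IF(SEQUENCE(n) + SEQUENCE(1, m) = d, outer, 0))))
  )

which returns the column {1;3;6;6;5;3}, the convolution of the two inputs. For long series the FFT route wins, but that is a different conversation.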
BTW, I really enjoy your coding!
- Patrick2788, Nov 01, 2025, Silver Contributor
Thank you for taking a look! I've been taking a lot of my unfinished projects and rolling them into polished generalized functions lately. At one time I was determined to find an elegant way to unwrap a row of thunks, but I think you have that covered! (I'll need to revisit that module soon. My methods tend to change over time as I write more functions and learn new things.)
The eye-opener was messing with Spiralλ: thunking all sequencing and arithmetic wherever possible and watching the calc time go down.
The analogy I like for using thunks in this manner is buying food in bulk at the grocery store. It's rarely mentioned at checkout, but the bag's weight (the tare weight) is subtracted before the price is worked out. The cost of using LET ("the bag") rather than heavily nested direct evaluation is slightly slower calc times, though we never notice it with small data sets. The gentle thunks give us back the tare weight.
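As a small illustration of the "bag": with direct nesting the same expression gets written (and, as far as I can tell, calculated) more than once, whereas LET names it once. The range A1:A100 is just a placeholder here:

Nested:   = IF(ROWS(FILTER(A1:A100, A1:A100 > 0, 0)) > 10, SUM(FILTER(A1:A100, A1:A100 > 0, 0)), 0)

With LET: = LET(pos, FILTER(A1:A100, A1:A100 > 0, 0), IF(ROWS(pos) > 10, SUM(pos), 0))

The LET version pays a tiny bookkeeping cost up front, and the thunking ideas above are one way of clawing that cost back once the variables get big.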