@RMartinhoFernandes Please link to a message. "What" out of context takes way too many CPU brain cycles to process and the ambiguity simply cannot be resolved.

@LucDanton @RMartinhoFernandes I noticed the discussion about C# yield return. I don't know if someone mentioned it, but Boost Asio comes with some pretty powerful Stackless Coroutines which introduce exactly this kind of yield syntax/semantics in C++

The proposed boost::coroutine library provides stackful coroutines (vs the stackless ones I showed).
The advantage of having a stack is that you can yield from a nested function, which means you can layer non-async-aware APIs (e.g. a boost.spirit parser) over the top of async calls.
However, one of the disadvantages is that you have a stack :)
You can't transparently implement composed operations using stackful coroutines because you have to pass the coroutine's "self" reference to the function. (Well actually you could do it by creating a new coroutine stack for each composed operation,…

@thecoshman Once we're at awesome, might I point out the JuraScope? It's a kind of digital telescope in the dino hall, which you can point at dinos and which will then transform them... youtube.com/watch?v=61XSDdpjirQ

@RMartinhoFernandes The Brachiosaurus brancai in Berlin is famous for being the biggest dinosaur skeleton on display. Many of the "small ones" you see hanging around it are almost as big as Dippy. :) And on the wall beside it, among many others, is the famous original(!) Archaeopteryx fossil.

@thecoshman Maybe I was a bit too late for that. What with me having to clear the airport, get to Victoria, leave the big bag there, and decipher the London public transport system far enough to take me to Kensington, it was 10am before I arrived. That might have been the ideal time for school classes to arrive, too.

fuck, I fucking think there just fucking might be a fucking way to fucking do what you fucking want, but fucked if I'm fucking going to fucking answer a fucking question that fucking introduces it fucking self with fuck

@thecoshman haha, of course I had. There's nothing intrinsically wrong with that particular approach, but I want to see whether compile-time training translates into run-time efficiency boosts, since it's the exact same framework

@thecoshman for batch algorithms that consider the full training set each time a sample is encountered, there's a clear efficiency gain by effectively hard-coding the values in the program (or rather, having the compiler do it for you...which is the point of metaprogramming)

I would like to reuse code by writing a proto transform which is templated by a function pointer:
template <typename Ret, typename A0, typename A1, Ret func(A0, A1)>
struct apply_func : proto::callable
{
    // Do something with func
};
However, the function itself is polymorphic so I do no...