I'm new to Streamr, so I don't yet know how it's architected or what its true limitations are, but from my cursory glance it seems like the best platform on which to build the following prediction streams idea:
The internet, and indeed the world, is in some sense made up of data streams. The community here at Streamr is, I'm sure, familiar with this view. In this light it seems almost natural to predict those data streams, to extrapolate them into their futures.
The inputs to any function aimed at predicting a data stream would obviously include the stream itself, as well as perhaps many other data streams. These other streams represent processes that, for whatever reason and in various contexts, become consistently correlated with the behavior of the target stream.
Understanding and detecting these correlations and their contexts may seem difficult but could be accomplished incrementally, even consistently, through static algorithms.
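To make that concrete, here is a minimal sketch of such a static algorithm in Python: it scores candidate streams by lagged correlation with a target stream. Nothing here is a Streamr API; it assumes the stream histories have already been resampled onto a common time grid, and the lag window and candidate names are purely illustrative.

```python
# Minimal sketch: scoring candidate streams by lagged correlation with a
# target stream. Assumes all histories are already sampled onto a common
# time grid; the lag window is an arbitrary illustration.
from statistics import mean, pstdev

def pearson(xs, ys):
    """Plain Pearson correlation of two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    sx, sy = pstdev(xs), pstdev(ys)
    if sx == 0 or sy == 0:
        return 0.0
    cov = mean((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / (sx * sy)

def best_correlates(target, candidates, max_lag=10, top_n=5):
    """Return the top_n candidate streams (name, score) whose lagged history
    best correlates with the target stream's most recent values."""
    scores = []
    for name, series in candidates.items():
        best = 0.0
        for lag in range(1, max_lag + 1):
            n = min(len(target), len(series) - lag)
            if n < 2:
                continue
            r = pearson(target[-n:], series[-(n + lag):-lag])
            best = max(best, abs(r))
        scores.append((best, name))
    scores.sort(reverse=True)
    return scores[:top_n]
```

Run repeatedly over incoming data, a loop like this incrementally builds up a picture of which streams are currently informative about the target, which is all the later steps need.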
If the extrapolation of one stream given only its own history (most of the technical analysis done today) is valuable, then the extrapolation of one stream given many streams must be much more valuable. And the extrapolation of all streams gives another exponential leap in value, as predictions begin to reinforce and correct one another.
For example, the predictor of stream A gains access to additional secondhand information, with a much larger footprint than it could process locally, by discovering that stream A, in the current context, is closely correlated not with stream B itself, but with the stream representing a predicted change in stream B. The predictor of stream A and the predictor of stream B do not share (all) the same inputs when making their predictions. In other words, by correlating the two, the predictor of stream A leverages the computation and bandwidth already expended in predicting stream B.
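As a toy illustration of that reuse, the sketch below has the predictor of A consume a single number published on a hypothetical prediction stream about B (B's predicted change over the next step) instead of re-deriving it from B's raw history. The function, weight, and values are assumptions made for illustration, not a proposed standard.

```python
# Illustrative sketch: predictor of stream A consuming, as one of its inputs,
# a prediction stream about B (B's predicted change over the next step)
# rather than B's raw values. All names and numbers are hypothetical.

def predict_next_a(a_history, predicted_delta_b, weight_b=0.5):
    """Naive one-step forecast for A: extrapolate A's own recent trend,
    then nudge it by the externally published prediction of B's change,
    which a correlation search like the one above found informative for A."""
    own_trend = a_history[-1] - a_history[-2] if len(a_history) >= 2 else 0.0
    return a_history[-1] + own_trend + weight_b * predicted_delta_b

# The predicted_delta_b value arrives on someone else's prediction stream,
# so A's predictor reuses that work instead of recomputing it from B.
next_a = predict_next_a([101.0, 102.5, 103.1], predicted_delta_b=0.4)
```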
Once a critical mass of data streams is being predicted, such that they reinforce and correct one another, a new layer can be built: data streams containing predictions of prediction streams. The ultimate end of such a system can't be known, but we can imagine one of the initial benefits being a kind of 'Google of the future' service, since essentially all known information has already been naturally aggregated into a structure that predicts the future of every data stream in the system.
The question of how this system evolves after that becomes more complex once we realize that the best prediction of the future, given everything, is itself information used to make decisions that change aspects of that future. Thus I do not believe we can know exactly where this chaotic feedback loop will lead, but what has been described is essentially a global proto-brain: the brain's main focus, at its smallest scales (neuronal mini-columns), is to predict the future of their own activation, and ultimately the future of the sensory input data streams.
So, to begin to evolve this global proto-brain, we need, first of all, the infrastructure to broadcast data streams. This is Streamr: decentralized, p2p (essential for reducing bandwidth limitations), integrated with a resource accounting mechanism (the DATA token), etc. Other possible platforms may include SingularityNET, Ocean Protocol, or even pseudo-centralized solutions such as Hashgraph or IOTA. Or a network could be built explicitly for this purpose, using simple Ethereum smart contracts for its distributed asset management needs.
Secondly, we need an evolving protocol/API specifically designed to make communication about prediction streams efficient. Much of the generic protocol used for all kinds of data streams remains useful, since a prediction stream is, after all, a data stream. Still, there is information unique to prediction streams that can be baked into the description of the stream and its messages.
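As a rough illustration only, the prediction-specific fields might look something like the following. Every field name here is an assumption, not an existing Streamr protocol, and the stream ids are made up.

```python
# Hedged sketch of prediction-stream-specific metadata, layered on top of an
# ordinary data-stream message. Field names are illustrative assumptions.
from dataclasses import dataclass
from typing import List

@dataclass
class PredictionMessage:
    target_stream_id: str       # the data stream being predicted
    issued_at: int              # unix ms timestamp when the prediction was made
    horizon_ms: int             # how far into the future this prediction reaches
    predicted_value: float      # the extrapolated value of the target stream
    confidence: float           # predictor's own 0..1 confidence estimate
    input_stream_ids: List[str] # streams used as inputs, enabling layering/audit

msg = PredictionMessage(
    target_stream_id="example.eth/air-quality/helsinki",
    issued_at=1_700_000_000_000,
    horizon_ms=60_000,
    predicted_value=42.7,
    confidence=0.62,
    input_stream_ids=["example.eth/traffic/helsinki"],
)
```

Publishing the input stream ids alongside each prediction is what would let higher layers (predictions of prediction streams) discover and audit the structure beneath them.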
Thirdly, we need an evolving generic algorithm usable by any naive predictor, agnostic to the type of data stream it is predicting. This algorithm needs to tend towards better predictions over time, in much the same way evolutionary algorithms do. We can imagine its minimal viable design being as simple as randomly sampling the space of all data streams until it finds streams more correlated with its target than anything it has seen before, and then using this current favorite set of streams to extrapolate the future.
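A minimal sketch of that loop, under the assumption that stream discovery simply hands the predictor a pool of candidate histories (the `available` mapping below is a stand-in for whatever discovery mechanism the network exposes), and reusing the `pearson` helper from the earlier correlation sketch:

```python
# Naive, stream-type-agnostic predictor: randomly sample the space of
# available streams, keep whichever correlate better with the target than
# anything seen before, and extrapolate with that favourite set.
import random

def evolve_favourites(target_history, available, favourites, sample_size=3, keep=5):
    """One generation of the search: sample candidate streams from
    `available` (stream id -> value history) and retain the `keep`
    best-correlated (score, stream_id) pairs found so far."""
    for stream_id in random.sample(list(available), min(sample_size, len(available))):
        if any(sid == stream_id for _, sid in favourites):
            continue  # already a favourite
        history = available[stream_id]
        n = min(len(target_history), len(history))
        if n < 2:
            continue
        score = abs(pearson(target_history[-n:], history[-n:]))
        favourites.append((score, stream_id))
    favourites.sort(reverse=True)
    del favourites[keep:]
    return favourites

def extrapolate_target(target_history, favourites):
    """Naive extrapolation: continue the target's own recent trend, scaled by
    how strongly the current favourite streams correlate with it."""
    trend = target_history[-1] - target_history[-2]
    strength = sum(score for score, _ in favourites) / max(len(favourites), 1)
    return target_history[-1] + trend * (0.5 + 0.5 * strength)
```

The point is not this particular heuristic but the shape of the loop: sampling, scoring, keeping favourites, and extrapolating are all generic operations, so the same code can run against any stream type, and better scoring or extrapolation strategies can be swapped in as the system evolves.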
Lastly, we need an economic impetus for it to exist. Those who participate in the creation of this proto-brain need to be compensated for their resource expenditure to that end. They must be able to collectively charge for their efforts, or, perhaps more naturally, they must rely on the increase in the value of the token natively used to economically direct those efforts. It is my hope that, perhaps through the use of Data Unions and the Streamr Marketplace, this necessary incentive structure can be put in place natively on the Streamr network.
As I see them, these are the four basic components needed to produce a prediction network full of prediction streams. It's just the beginning of an idea, but I believe it's an inevitable one, and what better place to build it than on Streamr?