The practical task is to return a Stream read from a file, without loading the whole file (or the parsed collection) into memory. What the stream is eventually used for may be decided later -- e.g. saving to a DB. The developer would hold a handle to a stream of deserialized objects instead of a fully deserialized collection.
The issue is that there is no guarantee that one line of the file corresponds to one MyEntity object (if it did, I could have used this article: http://blog.codeleak.pl/2014/05/parsing-file-with-stream-api-in-java-8.html).
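For reference, the one-line-per-entity case is easy, because BufferedReader.lines() is already lazy. A minimal sketch, assuming a hypothetical MyEntity with a fromLine factory (names invented for illustration):

```java
import java.io.BufferedReader;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import java.util.stream.Stream;

public class LineParser {
    // Hypothetical one-line-per-record entity: each line "id,name" maps to one object.
    record MyEntity(int id, String name) {
        static MyEntity fromLine(String line) {
            String[] parts = line.split(",", 2);
            return new MyEntity(Integer.parseInt(parts[0]), parts[1]);
        }
    }

    // lines() pulls from the reader on demand, so the file is never fully materialized.
    public static Stream<MyEntity> parse(InputStream in) {
        BufferedReader reader = new BufferedReader(
                new InputStreamReader(in, StandardCharsets.UTF_8));
        return reader.lines().map(MyEntity::fromLine);
    }
}
```

This only works when the line-to-entity mapping is one-to-one, which is exactly the assumption that fails here.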
In general, one may face a situation where, given an input stream, one needs to return an output stream in which a variable number of input items is mapped to a single output item.
So, my solution so far uses a Supplier, like this:
import java.io.InputStream;
import java.util.Scanner;
import java.util.function.Supplier;
import java.util.stream.Stream;

public class Parser {
    public Stream<MyEntity> parse(final InputStream stream) {
        return Stream.generate(new AggregatingSupplier(stream));
    }

    private static class AggregatingSupplier implements Supplier<MyEntity> {
        private final Scanner r;

        public AggregatingSupplier(final InputStream source) {
            this.r = new Scanner(source);
        }

        @Override
        public MyEntity get() {
            MyEntity re = new MyEntity();
            // Consume as many lines as needed to build one complete entity.
            while (r.hasNextLine() && !re.isComplete()) {
                String line = r.nextLine();
                // ... do some processing
            }
            return re;
        }
    }
}
The problem with this approach is that the stream obtained from Stream.generate is infinite: there is no stop condition. Throwing an exception once the input is exhausted works, somewhat, as does switching to a completely different (classical) approach, but neither feels right.
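One terminating alternative is to move the aggregation into an Iterator whose hasNext() delegates to the Scanner, and build the stream via StreamSupport. A sketch under the same assumptions as above (MyEntity and its isComplete() are stand-ins; here the entity is hypothetically "complete" after absorbing two lines, just to make the example runnable):

```java
import java.io.InputStream;
import java.util.Iterator;
import java.util.Scanner;
import java.util.Spliterator;
import java.util.Spliterators;
import java.util.stream.Stream;
import java.util.stream.StreamSupport;

public class IteratorParser {
    // Hypothetical entity: complete after absorbing two lines.
    static class MyEntity {
        final StringBuilder data = new StringBuilder();
        int linesSeen = 0;
        void absorb(String line) { data.append(line).append(';'); linesSeen++; }
        boolean isComplete() { return linesSeen >= 2; }
    }

    public Stream<MyEntity> parse(InputStream in) {
        Scanner scanner = new Scanner(in);
        Iterator<MyEntity> it = new Iterator<>() {
            @Override
            public boolean hasNext() {
                // The stream ends exactly when the underlying input does.
                return scanner.hasNextLine();
            }

            @Override
            public MyEntity next() {
                MyEntity entity = new MyEntity();
                // Same aggregation loop as in the Supplier version.
                while (scanner.hasNextLine() && !entity.isComplete()) {
                    entity.absorb(scanner.nextLine());
                }
                return entity;
            }
        };
        return StreamSupport.stream(
                Spliterators.spliteratorUnknownSize(
                        it, Spliterator.ORDERED | Spliterator.NONNULL),
                false); // sequential; parallel splitting is not meaningful here
    }
}
```

The stream stays lazy (nothing is read until a terminal operation pulls), but unlike Stream.generate it is finite, so collect() or forEach() terminate on their own.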