I'm trying to fix the currently broken seeking ability of VPlayer which is a FFmpeg player for Android. Being a Java developer, C code looks like alien language to me so can only fix it using common logic (which could make any C veteran have a good laugh).
The relevant file is player.c and I'll try my best to point out the relevant modification.
So the basic idea is because FFmpeg's av_seek_frame is very inaccurate even with AVSEEK_FLAG_ANY so I'm trying to follow this suggestion to seek backward to the nearest keyframe and then decode to the frame I want. One addition note is since I want to seek based on millisecond while the said solution show the way to seek by frame which is potentially a source of problem.
In the Player I add the following fields:
struct Player{
....
AVFrame *frame;
int64_t current_time_stamp;
}; 
In the player_read_from_stream I modify the seeking part as:
void * player_read_from_stream(void *data) {
    ...
    struct DecoderData *decoder_data = data;
    int stream_no = decoder_data->stream_no;
    AVCodecContext * ctx = player->input_codec_ctxs[stream_no];
    ...
    // seeking, start my stuff
    if(av_seek_frame(player->input_format_ctx, seek_input_stream_number, seek_target, AVSEEK_FLAG_BACKWARD) >= 0){
        //seek to key frame success, now need to read every frame from the key frame to our target time stamp
        while(player->current_time_stamp < seek_target){
            int frame_done;
            while (av_read_frame(player->input_format_ctx, &packet) >= 0) {
                if (packet.stream_index == seek_input_stream_number) {
                    avcodec_decode_video2(ctx, player->frame, &frame_done, &packet);
                    LOGI(1,"testing_stuff ctx %d", *ctx);
                    if (frame_done) {
                        player->current_time_stamp = packet.dts;
                        LOGI(1,"testing_stuff current_time_stamp: %"PRId64, player->current_time_stamp);
                        av_free_packet(&packet);
                        return;
                    }
                }
                av_free_packet(&packet);
            }
        }
    }
    //end my stuff
    LOGI(3, "player_read_from_stream seeking success");
    int64_t current_time = av_gettime();
    player->start_time = current_time - player->seek_position;
    player->pause_time = current_time;        
}
And in player_alloc_frames I allocate the memory for my frame as:
int player_alloc_frames(struct Player *player) {
    int capture_streams_no = player->caputre_streams_no;
    int stream_no;
    for (stream_no = 0; stream_no < capture_streams_no; ++stream_no) {
        player->input_frames[stream_no] = av_frame_alloc();
        //todo: test my stuff
        player->frame = av_frame_alloc();
        //end test
        if (player->input_frames[stream_no] == NULL) {
            return -ERROR_COULD_NOT_ALLOC_FRAME;
        }
    }
    return 0;
}
Currently it just keep crashing and being a typical Android NDK's "feature", it just provide a super helpful stack trace:
libc: Fatal signal 11 (SIGSEGV), code 1, fault addr 0x40 in tid 2717 (FFmpegReadFromS)
I very much appreciate if anyone could help me solve this problem. Thank you for your time.