Given two sorted arrays of integers A1, A2, both of length n, and an integer x, I need to write an algorithm that runs in O(n log n) and determines whether there exist two elements a1, a2 (one from each array) such that a1 + a2 = x.
At first I thought about having two index pointers i1 = 0, i2 = 0 (one for each array) that both start at 0 and advance one at a time, depending on whether the next element of A1 is bigger or smaller than the next element of A2. But after testing it on a couple of arrays I found that it can miss valid pairs... (a rough sketch of that attempt is below).
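Here is roughly what I tried, written as a Python sketch (the function name and the exact advancing rule are just my attempt to formalize the idea, not a working solution):

```python
def has_pair_with_sum(A1, A2, x):
    """Sketch of the single-pass two-pointer idea described above.
    Both indices start at 0 and only one advances per step, so the
    scan can walk past a valid pair and never come back to it."""
    n = len(A1)
    i1, i2 = 0, 0
    while i1 < n and i2 < n:
        s = A1[i1] + A2[i2]
        if s == x:
            return True
        # advance the pointer whose next element is smaller
        # (falling back to the other pointer at an array's end)
        if i1 + 1 < n and (i2 + 1 >= n or A1[i1 + 1] <= A2[i2 + 1]):
            i1 += 1
        else:
            i2 += 1
    return False
```

For example, with this formalization, A1 = [1, 4], A2 = [2, 3], x = 6 returns False even though 4 + 2 = 6, because the scan moves both pointers past the pair and cannot go back.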