You are given an array \(A\) of \(N\) positive integers and \(Q\) queries. In each query, you are given an index \(i\) \((1 \leq i \leq N)\). Find an index \(j\) \((1 \leq j \leq N)\) such that \((i \,\&\, j) = 0\) and \(A_i + A_j\) is maximized, and print that maximum value of \(A_i + A_j\). If no such index \(j\) exists, the answer is \(-1\).
Note: Here, \(\&\) is the bitwise AND operator, and it is applied to the indices \(i\) and \(j\) themselves, not to the array values. For example, \(i = 2\) (binary \(10\)) and \(j = 5\) (binary \(101\)) satisfy \(i \,\&\, j = 0\).
Input format
- The first line contains \(T\) denoting the number of test cases.
- The first line of each test case contains an integer \(N\) denoting the number of array elements.
- The next line contains \(N\) space-separated positive integers.
- The next line contains an integer \(Q\) denoting the number of queries that must be performed.
- Each of the next \(Q\) lines contains an integer \(i\).
Output format
For each test case, print \(Q\) lines, where the \(k^{th}\) line contains the answer to the \(k^{th}\) query.
Constraints
\(1 \leq T \leq 20\)
\(1 \leq N, Q \leq 10^5\)
\(1 \leq A_i \leq 10^9\)
\(1 \leq Q_k \leq N\), where \(Q_k\) is the index \(i\) given in the \(k^{th}\) query
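The original page gates the editor and the editorial behind a login, so no official approach is available here. Purely as an illustration, the sketch below assumes a sum-over-subsets (SOS) DP over index masks: since \(N \leq 10^5 < 2^{17}\), every index fits in \(B = 17\) bits, and \((i \,\&\, j) = 0\) holds exactly when \(j\) is a submask of the bitwise complement of \(i\). The constant `B`, the variable names, and the I/O layout are mine, matched to the input format above; treat this as a sketch under those assumptions, not the official solution.

```cpp
// Unofficial sketch: SOS DP over index masks for queries of the form
// "max A[i] + A[j] over j with (i & j) == 0", per the statement above.
#include <bits/stdc++.h>
using namespace std;

int main() {
    ios_base::sync_with_stdio(false);
    cin.tie(nullptr);

    int T;
    cin >> T;
    while (T--) {
        int N;
        cin >> N;
        vector<long long> A(N + 1);
        for (int k = 1; k <= N; ++k) cin >> A[k];

        const int B = 17;                  // 2^17 = 131072 > 10^5, so all indices fit
        const int FULL = (1 << B) - 1;
        // best[mask] = max A[j] over indices 1 <= j <= N that are submasks of mask
        vector<long long> best(1 << B, LLONG_MIN);
        for (int j = 1; j <= N; ++j) best[j] = A[j];
        for (int b = 0; b < B; ++b)
            for (int mask = 0; mask <= FULL; ++mask)
                if (mask & (1 << b))
                    best[mask] = max(best[mask], best[mask ^ (1 << b)]);

        int Q;
        cin >> Q;
        while (Q--) {
            int i;
            cin >> i;
            int comp = FULL & ~i;          // valid j are exactly the submasks of comp
            if (best[comp] == LLONG_MIN)
                cout << -1 << '\n';        // no index j with (i & j) == 0 exists
            else
                cout << A[i] + best[comp] << '\n';
        }
    }
    return 0;
}
```

Preprocessing costs \(O(2^B \cdot B)\) per test case (about \(2.2 \times 10^6\) operations for \(B = 17\)), after which each query is answered in \(O(1)\). As a hand-checked example: with \(A = [10, 20, 30, 40]\) and query \(i = 2\), the indices \(j \leq 4\) with \(2 \,\&\, j = 0\) are \(\{1, 4\}\), so the answer is \(20 + 40 = 60\).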