Optimizing Java Algorithms: Fast I/O, Greedy Strategies, and Dynamic Programming Patterns
Greedy Interval Scheduling
When selecting the maximum number of non-overlapping events from a given set, a greedy strategy based on finish times yields an optimal solution. Sorting intervals by their end points ensures that each selected event leaves the maximum possible remaining time for subsequent choices.
A common implementation pitfall involves using java.util.Scanner for large datasets ($N \le 10^6$), which frequently results in Time Limit Exceeded (TLE) errors. Replacing standard input parsing with a buffered tokenizer drastically reduces overhead.
import java.io.*;
import java.util.*;
public class IntervalScheduler {
    static class Session implements Comparable<Session> {
        int start, end;

        Session(int s, int e) { start = s; end = e; }

        @Override
        public int compareTo(Session other) {
            return Integer.compare(this.end, other.end);
        }
    }

    public static void main(String[] args) throws IOException {
        FastInput reader = new FastInput();
        int total = reader.nextInt();
        Session[] events = new Session[total];
        for (int i = 0; i < total; i++) {
            events[i] = new Session(reader.nextInt(), reader.nextInt());
        }
        Arrays.sort(events);
        int selected = 0;
        int lastEnd = -1;
        for (Session evt : events) {
            if (evt.start >= lastEnd) {
                selected++;
                lastEnd = evt.end;
            }
        }
        System.out.println(selected);
    }
}
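To see why sorting by finish time is the right criterion, consider a long event that overlaps several short ones: picking it first blocks every later choice. The following minimal standalone check (the class name `GreedyComparison` and the `int[][]` event representation are illustrative, not part of the solution above) demonstrates the finish-time greedy on exactly such an input.

```java
import java.util.*;

public class GreedyComparison {
    // Greedy by earliest finish time: each pick leaves the most room for the rest.
    static int byFinishTime(int[][] events) {
        Arrays.sort(events, Comparator.comparingInt(e -> e[1]));
        int count = 0, lastEnd = Integer.MIN_VALUE;
        for (int[] e : events) {
            if (e[0] >= lastEnd) {
                count++;
                lastEnd = e[1];
            }
        }
        return count;
    }

    public static void main(String[] args) {
        // The long event [1, 10] overlaps all three short ones; finish-time
        // ordering correctly picks the three short events instead.
        int[][] events = { {1, 10}, {1, 3}, {3, 5}, {5, 10} };
        System.out.println(byFinishTime(events)); // 3
    }
}
```

A strategy that greedily took the earliest-starting or longest event would select `[1, 10]` and end up with a single event here.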
Frequency-Based Sorting for Bounded Ranges
Sorting a massive collection of integers whose value range $K$ is significantly smaller than the collection size $N$ should avoid comparison-based algorithms such as Quicksort or Shellsort. A counting approach instead runs in $O(N + K)$ time, linear in both the input size and the value range.
For election tallying or similar voting problems, mapping each candidate ID to an index in a frequency array allows $O(1)$ updates per vote. Reconstructing the sorted output requires iterating through the frequency array and appending values to a StringBuilder to avoid excessive I/O flush operations.
import java.io.*;
public class VoteTally {
    public static void main(String[] args) throws IOException {
        FastInput in = new FastInput();
        int candidates = in.nextInt();
        int votes = in.nextInt();
        int[] counts = new int[candidates + 1];
        for (int i = 0; i < votes; i++) {
            counts[in.nextInt()]++;
        }
        StringBuilder sb = new StringBuilder(votes * 2);
        for (int id = 1; id <= candidates; id++) {
            int freq = counts[id];
            while (freq-- > 0) {
                sb.append(id).append(' ');
            }
        }
        System.out.println(sb);
    }
}
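The same frequency technique extends to ranges that include negative values by shifting each value into array index space. A sketch, assuming the bounds `lo` and `hi` are known in advance (the class name `OffsetCountingSort` is hypothetical):

```java
import java.util.*;

public class OffsetCountingSort {
    // Counting sort for values in [lo, hi]; the index (value - lo)
    // shifts any integer range, including negatives, into [0, hi - lo].
    static int[] countingSort(int[] data, int lo, int hi) {
        int[] counts = new int[hi - lo + 1];
        for (int v : data) counts[v - lo]++;
        int[] sorted = new int[data.length];
        int pos = 0;
        for (int v = lo; v <= hi; v++) {
            int freq = counts[v - lo];
            while (freq-- > 0) sorted[pos++] = v;
        }
        return sorted;
    }

    public static void main(String[] args) {
        int[] data = { 3, -1, 2, -1, 0 };
        System.out.println(Arrays.toString(countingSort(data, -1, 3)));
        // [-1, -1, 0, 2, 3]
    }
}
```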
Iterative Binary Search for First Occurrence
Locating the first occurrence of a target value in a monotonically non-decreasing array requires a modified binary search. Recursive implementations introduce unnecessary stack overhead and can complicate index tracking. An iterative lower-bound approach is both memory-efficient and faster.
The loop maintains a search range [left, right). When arr[mid] >= target, the right boundary contracts to mid, preserving the potential first occurrence. Otherwise, the left boundary advances. Fast I/O remains critical when handling $10^5$ queries against $10^6$ elements.
import java.io.*;
public class FirstIndexFinder {
    public static void main(String[] args) throws IOException {
        FastInput in = new FastInput();
        int n = in.nextInt();
        int m = in.nextInt();
        int[] data = new int[n];
        for (int i = 0; i < n; i++) data[i] = in.nextInt();
        StringBuilder out = new StringBuilder();
        while (m-- > 0) {
            int query = in.nextInt();
            out.append(findLowerBound(data, query)).append(' ');
        }
        System.out.println(out);
    }

    // Returns the 1-indexed position of the first occurrence of key, or -1 if absent.
    private static int findLowerBound(int[] arr, int key) {
        int left = 0, right = arr.length; // half-open search range [left, right)
        while (left < right) {
            int mid = left + (right - left) / 2;
            if (arr[mid] >= key) {
                right = mid;
            } else {
                left = mid + 1;
            }
        }
        return (left < arr.length && arr[left] == key) ? left + 1 : -1;
    }
}
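Pairing the lower bound with an analogous upper bound (first index strictly greater than the key) yields the occurrence count of any value in $O(\log N)$. A self-contained sketch under those definitions (the class name `OccurrenceCounter` is illustrative):

```java
public class OccurrenceCounter {
    // First index with arr[i] >= key, searching the half-open range [0, n).
    static int lowerBound(int[] arr, int key) {
        int left = 0, right = arr.length;
        while (left < right) {
            int mid = left + (right - left) / 2;
            if (arr[mid] >= key) right = mid; else left = mid + 1;
        }
        return left;
    }

    // First index with arr[i] > key; the difference of the two bounds
    // is exactly the number of occurrences of key.
    static int upperBound(int[] arr, int key) {
        int left = 0, right = arr.length;
        while (left < right) {
            int mid = left + (right - left) / 2;
            if (arr[mid] > key) right = mid; else left = mid + 1;
        }
        return left;
    }

    public static void main(String[] args) {
        int[] data = { 1, 2, 2, 2, 5, 7 };
        System.out.println(upperBound(data, 2) - lowerBound(data, 2)); // 3
    }
}
```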
Linear Maximum Subarray Sum
The maximum contiguous subarray sum problem is efficiently solved using Kadane's algorithm, which operates in $O(N)$ time and $O(1)$ space. The core logic maintains a running sum that resets to the current element whenever the accumulated sum becomes detrimental (i.e., negative).
Initialization errors often cause failures on edge cases, particularly when all input values are negative. Setting the initial maximum to Long.MIN_VALUE or the first element ensures correctness across all valid inputs.
import java.io.*;
public class MaxContiguousSum {
    public static void main(String[] args) throws IOException {
        FastInput in = new FastInput();
        int size = in.nextInt();
        long currentRun = 0;
        long globalMax = Long.MIN_VALUE;
        for (int i = 0; i < size; i++) {
            int val = in.nextInt();
            currentRun = Math.max(val, currentRun + val);
            globalMax = Math.max(globalMax, currentRun);
        }
        System.out.println(globalMax);
    }
}
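Some problems additionally ask for the boundaries of the optimal subarray. The reset step makes this easy to track: record where the current run began, and promote those indices whenever a new global maximum appears. A sketch (the class name `MaxSubarrayWithBounds` and the `long[]` return convention are illustrative choices):

```java
public class MaxSubarrayWithBounds {
    // Kadane's algorithm extended to report { bestSum, start, end } (inclusive).
    static long[] maxSubarray(int[] arr) {
        long best = Long.MIN_VALUE, run = 0;
        int runStart = 0, bestStart = 0, bestEnd = 0;
        for (int i = 0; i < arr.length; i++) {
            if (run + arr[i] < arr[i]) { // resetting beats extending
                run = arr[i];
                runStart = i;
            } else {
                run += arr[i];
            }
            if (run > best) {
                best = run;
                bestStart = runStart;
                bestEnd = i;
            }
        }
        return new long[] { best, bestStart, bestEnd };
    }

    public static void main(String[] args) {
        long[] r = maxSubarray(new int[] { -2, 1, -3, 4, -1, 2, 1, -5, 4 });
        System.out.println(r[0] + " [" + r[1] + ", " + r[2] + "]"); // 6 [3, 6]
    }
}
```

On the classic test vector above, the best subarray is `{4, -1, 2, 1}` at indices 3 through 6, summing to 6.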
Unbounded Knapsack with Large Value Constraints
In complete knapsack scenarios where items can be selected multiple times, a one-dimensional DP array updated in forward order correctly models the state transitions. The recurrence dp[j] = max(dp[j], dp[j - weight] + value) allows unlimited usage of each item within the capacity limit.
A frequent oversight involves integer overflow. When total values exceed $2^{31}-1$, the DP array must use long. Failing to upgrade the data type results in incorrect outputs on large test cases despite logically sound transitions.
import java.io.*;
public class HerbHarvest {
    public static void main(String[] args) throws IOException {
        FastInput in = new FastInput();
        int capacity = in.nextInt();
        int types = in.nextInt();
        long[] dp = new long[capacity + 1];
        for (int i = 0; i < types; i++) {
            int timeCost = in.nextInt();
            int value = in.nextInt();
            for (int t = timeCost; t <= capacity; t++) {
                dp[t] = Math.max(dp[t], dp[t - timeCost] + value);
            }
        }
        System.out.println(dp[capacity]);
    }
}
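The direction of the inner loop is exactly what separates unbounded from 0/1 semantics: iterating forward lets dp[j - weight] already include the current item (unlimited reuse), while iterating backward guarantees it reflects only previous items (at most one use). A side-by-side sketch, with illustrative method names:

```java
public class KnapsackVariants {
    // Unbounded: forward order, so dp[t - w] may already contain item i.
    static long unbounded(int cap, int[] w, int[] v) {
        long[] dp = new long[cap + 1];
        for (int i = 0; i < w.length; i++)
            for (int t = w[i]; t <= cap; t++)
                dp[t] = Math.max(dp[t], dp[t - w[i]] + v[i]);
        return dp[cap];
    }

    // 0/1: reverse order, so dp[t - w] still reflects the previous item row.
    static long zeroOne(int cap, int[] w, int[] v) {
        long[] dp = new long[cap + 1];
        for (int i = 0; i < w.length; i++)
            for (int t = cap; t >= w[i]; t--)
                dp[t] = Math.max(dp[t], dp[t - w[i]] + v[i]);
        return dp[cap];
    }

    public static void main(String[] args) {
        int[] w = { 2, 3 }, v = { 3, 4 };
        System.out.println(unbounded(7, w, v)); // 10: item 0 twice + item 1
        System.out.println(zeroOne(7, w, v));   // 7: each item at most once
    }
}
```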
High-Performance Input Parsing in Java
Standard Scanner tokenization relies on regular expressions, introducing significant latency for competitive programming workloads. java.io.StreamTokenizer combined with BufferedReader provides a lightweight alternative that parses numeric tokens directly from the character stream.
The tokenizer splits input based on whitespace and numeric boundaries. Calling nextToken() advances the stream, while nval retrieves the parsed double representation, which can be safely cast to int or long for algorithmic use. Wrapping this logic in a dedicated utility class standardizes fast I/O across multiple problem solutions.
import java.io.*;
class FastInput {
    private final StreamTokenizer tokenizer;

    public FastInput() {
        tokenizer = new StreamTokenizer(new BufferedReader(new InputStreamReader(System.in)));
    }

    public int nextInt() throws IOException {
        tokenizer.nextToken();
        return (int) tokenizer.nval;
    }
}
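One caveat: nval is a double, so integers above $2^{53}$ cannot be represented exactly and a cast to long may silently return a rounded value. When a problem's answers exceed that bound, parsing digits directly from the byte stream avoids the double round-trip. A sketch under that assumption (the class name `RawLongReader` is hypothetical; it accepts any InputStream so it can be tested without System.in):

```java
import java.io.*;

class RawLongReader {
    private final DataInputStream in;

    RawLongReader(InputStream source) {
        in = new DataInputStream(new BufferedInputStream(source));
    }

    // Parses a (possibly negative) decimal long digit by digit from the
    // byte stream, skipping leading whitespace, with no double conversion.
    long nextLong() throws IOException {
        int c = in.read();
        while (c == ' ' || c == '\n' || c == '\r' || c == '\t') c = in.read();
        boolean negative = c == '-';
        if (negative) c = in.read();
        long result = 0;
        while (c >= '0' && c <= '9') {
            result = result * 10 + (c - '0');
            c = in.read();
        }
        return negative ? -result : result;
    }
}
```

For example, the value $2^{53}+1 = 9007199254740993$ survives this parser intact, whereas a double-based nval would round it to $2^{53}$.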