A cut is a set of edges that, if removed, disconnects the graph; a minimum cut is a smallest such set. Every minimum cut corresponds to a partition of the graph's vertices into two non-empty subsets, such that the edges in the cut are exactly those with one endpoint in each subset.
Karger's algorithm builds a cut of the graph by randomly creating such a partition: at each iteration it chooses a random edge and contracts the graph around it, merging the edge's two endpoints into a single vertex and updating the remaining edges. Any self-loops introduced (including the chosen edge itself) are removed from the new graph, while parallel edges are kept: if the algorithm chooses an edge (u, v) and both u and v have an edge to a third vertex w, the new graph will have two edges between the merged vertex z and w. After n-2 contractions only two macro-vertices are left, and the parallel edges between them form the cut.
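The contraction loop described above can be sketched as follows. This is a minimal illustration, not the implementation discussed later: it represents the multigraph as a plain edge list over vertices 0..n-1 and tracks merges with a simple "leader" array (the function name and representation are assumptions for this sketch).

```python
import random

def karger_one_run(edges, n):
    """One run of Karger's contraction on a multigraph given as a
    list of (u, v) edges over vertices 0..n-1.
    Returns the size of the cut that this run finds."""
    # leader[v] points toward the macro-vertex that absorbed v.
    leader = list(range(n))

    def find(v):
        # Follow merge pointers, with path halving for speed.
        while leader[v] != v:
            leader[v] = leader[leader[v]]
            v = leader[v]
        return v

    remaining = n
    while remaining > 2:
        u, v = random.choice(edges)    # pick a uniformly random edge
        ru, rv = find(u), find(v)
        if ru == rv:
            continue                   # self-loop in the contracted graph: skip it
        leader[ru] = rv                # contract: merge u's side into v's
        remaining -= 1

    # Parallel edges whose endpoints now lie in different
    # macro-vertices form the cut.
    return sum(1 for u, v in edges if find(u) != find(v))
```

Keeping the original edge list and skipping edges that have become self-loops is equivalent to sampling uniformly among the surviving edges, and it automatically preserves parallel edges.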
The algorithm is a Monte Carlo algorithm, i.e. its running time is deterministic, but it is not guaranteed to return the correct (minimum) cut on any given run.
The probability of finding the minimum cut in a single run of the algorithm is actually pretty low: the classic analysis only guarantees a lower bound of 2/(n*(n-1)), roughly 2/n², where n is the number of vertices in the graph. Nonetheless, by running the algorithm multiple times and keeping the best result found, the probability that none of the runs finds the minimum cut becomes very small: at most 1/e (Euler's number) for about n²/2 runs, and at most 1/n for about (n² * log(n))/2 runs. For large values of n, i.e. for large graphs, that is a negligible probability.
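As a quick numeric check of these bounds, the failure probability after k independent runs is at most (1 - p)^k, where p = 2/(n(n-1)) is the per-run success lower bound. The helper below (a hypothetical name, just for illustration) evaluates that expression:

```python
import math

def failure_bound(n, runs):
    """Upper bound on the probability that `runs` independent runs of
    Karger's algorithm all miss the minimum cut, using the per-run
    success lower bound p = 2 / (n * (n - 1))."""
    p = 2 / (n * (n - 1))
    return (1 - p) ** runs

# With about n^2/2 runs (i.e. 1/p runs) the bound dips below 1/e,
# and with about (n^2 * log n)/2 runs it drops below 1/n.
```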
The implementation provided is written in Python, assumes the graph is represented as an adjacency list (a dictionary), and is restricted to integer vertex labels (ideally the numbers from 0 to n-1): this limitation allows it to exploit the union-find implementation provided, and can easily be overcome by mapping the original labels to the range [0..n-1].
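The label-mapping workaround mentioned above might look like the sketch below (the helper name is an assumption; `adj` is an adjacency dictionary mapping each vertex to a list of its neighbours):

```python
def normalize_labels(adj):
    """Map arbitrary hashable vertex labels to integers 0..n-1, so an
    integer-only union-find can be used on the result.
    Returns the relabelled adjacency dict and the label -> int mapping."""
    index = {v: i for i, v in enumerate(adj)}
    relabelled = {index[v]: [index[w] for w in nbrs]
                  for v, nbrs in adj.items()}
    return relabelled, index
```

The `index` mapping can be kept around to translate the resulting cut back to the original labels.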