Next time you’re in Washington, stop by the National Archives and take a look: There, in Article VI, the Constitution specifies that treaties are “the supreme Law of the Land; and the judges in every state shall be bound thereby, anything in the Constitution or laws of any state to the contrary notwithstanding.”
The supremacy of international treaties over state laws was a bedrock principle for 150 years, observes David L. Sloss, a professor of law at SCU. But as his new book makes clear, something happened. The Death of Treaty Supremacy: An Invisible Constitutional Change (Oxford University Press) traces the trouble to 1945 and the signing of the UN Charter, which includes the requirement to uphold “human rights … for all without distinction as to race.”
Five years later, a California state court used that charter and the supremacy clause to overturn a law discriminating against Japanese nationals. The implications for Jim Crow laws still on the books were clear. Conservatives proposed a constitutional amendment to invalidate the supremacy clause. Liberals argued that an amendment wasn’t necessary; the clause was optional, they said: only self-executing treaties, those that take effect without implementing legislation, fell under the supremacy clause. It has been interpreted that way ever since. “The optional supremacy rule impairs the president’s ability to conduct foreign policy,” Sloss notes. President George W. Bush learned that firsthand: in Medellín v. Texas, he ordered Texas to comply with U.S. treaty obligations, and the Supreme Court ruled that Texas didn’t have to, because the treaty provision in question was non-self-executing.
The result: “Texas subverted U.S. compliance with a treaty obligation that binds the entire nation.” For those paying attention to original intent, Sloss says, that is precisely the outcome the Framers sought to avert. After all, it was a treaty (Paris, 1783) that gave us a country.