
I've heard the argument that raising wages would cause American citizens to take up these jobs.

While it sounds logical, there's nothing in American history to suggest this would happen, that citizens would take these jobs.

What was America founded upon? Slavery. They wanted to enslave the indigenous people, but when that didn't work out, they enslaved Africans.

They always wanted an underclass.

When African Americans were freed, did they raise wages so white citizens would take the jobs? No. They imported coolie labor from Ireland and China. These men were held in debt bondage, if not by law, then in practice, through the racism they faced.

Then the Irish spearheaded attacks on Chinese workers and eventually won a law prohibiting Chinese immigration. But Asian contract labor continued, from Japan, Korea, and the Philippines at different times, and each group was eventually excluded from immigration as well.

Then, when the economy collapsed in the Depression, they deported Mexicans and Mexican Americans, many of whom were citizens.

When the demand for labor came back, they created a guest worker program, the Bracero Program, which was a virtual kind of slavery. The workers even got ripped off, cheated out of wages they were owed. Japanese Americans who had been imprisoned in concentration camps were also "allowed" to work on farms, for a fraction of what free people earned.

After that, we had the "illegal immigrant" period, which, compared to the earlier periods of labor importation that resembled slavery, was one of relative freedom.

We also shifted the economy somewhat toward imported goods.

Now, we're headed back into an exclusionary period, and it feels like fascism.

There were short periods in this history when wages went up and people could make some money doing farm labor, cleaning houses, doing handyperson work, slaughtering animals for food, and so forth, exactly the same jobs that enslaved people did in the early days of the US. Those periods didn't last.