The Truth About Facebook

Ask this expert on data science what Congress should have asked Mark Zuckerberg last week, and she’ll say, nothing.

If that surprises you, please understand that these hearings are not designed to deliver the truth. They’re designed to tell the public that, if there’s a problem, the government is doing something about it. So you’re not gonna get much straight talk. It’s not that no one cares. But the problem is bigger than Facebook or the United States, as we’ve explained before.

And people have a right to know.

Now, I’m no data scientist, but I spend a lot of time on these issues in my line of work, so here are some basic truths you should know if they haven’t sunk in yet. I give the same advice to clients whenever it comes up.

  1. You are not the customer. You are the product. The business of these platforms is to derive as much information about you as possible and sell it to others. Governments like this, too, because it’s a great way to study people, keep tabs on them, and even manipulate them. No one will stop doing this for the foreseeable future.
  2. You should assume that you create a permanent record of everything you do on your phone or the internet. You can’t avoid that by logging out of an app or even deleting it. The only way may be to give up electronic devices altogether and live off the grid. And good luck with that.
  3. Your friends and family don’t decide what you see when you log in. Facebook does. Or whichever other company does. Obviously, they want to show you what they think you want to see so you’ll spend more time on their platform. But they can also manipulate what you see—or even what you think you want to see.

Welcome to the 21st century. It’s no wonder Americans are throwing up their hands over privacy. But at least we can still debate and, hopefully, decide how we want to live in the United States. The same does not apply around the world.

 

The Microsoft-Ireland Case is Moot

We covered this before here and here.

But the high-profile case of United States v. Microsoft Corporation is now over. The question was whether the government could force Microsoft to turn over data that it stored on servers in other countries. The problem was that federal law didn’t allow the government to do that, or at least it wasn’t clear. So Microsoft challenged the government. It lost in the trial court, won on appeal, and landed before the U.S. Supreme Court last fall.

The Supreme Court heard the case in February, but in March, the new federal spending bill changed everything. It included a law called the CLOUD Act, which stands for Clarifying Lawful Overseas Use of Data.

Under the new law, companies like Microsoft must produce data in their possession, custody, or control even if it’s located outside the United States. They may object if they reasonably believe that a target is not from the U.S. and that, by producing the data, they will break the laws of a “qualifying foreign government.” That means a government with which the U.S. has agreed to grant reciprocal access to such data for use in criminal cases. But there are no agreements yet. To get there, a foreign government must, among other things, not target U.S. persons, and it must be committed to due process, data privacy, free speech, and other civil rights and liberties.

Within a week of the new law, the U.S. moved to dismiss the case as moot.

A few days later, Microsoft agreed. For the full text of the CLOUD Act, see here, and scroll all the way down to page 866. For the company’s detailed statement on the new law, see here.

And so the trilogy is complete.

SEC Chair Offers Advice on Bitcoin and Its Ilk

This week, the chair of the U.S. Securities and Exchange Commission weighed in on crypto-currencies as well as ICOs or initial coin offerings. With the price of bitcoin nearing $20,000, it probably comes at the right time. You may have been wondering yourself: What are the rules for this stuff? Are they being followed? And what are the risks in these markets?

Here is a summary of his advice for both Main Street and Wall Street.

For Main Street

These are the folks at home who may be tempted to jump on the bandwagon.

  1. Understand that, for now, it’s the Wild West out there. The SEC hasn’t approved any crypto-currency-related funds or products for listing and trading, and no one has registered an ICO with the Commission. Don’t let anyone today tell you otherwise.
  2. Do your homework. If you choose to invest in these things, ask plenty of questions and demand clear answers. The Chair’s statement includes a list of sample questions to consider. Be especially careful if a pitch sounds too good to be true or you’re pressured to act quickly.
  3. Understand that these markets cross borders, so your money may travel overseas even without your knowledge. Once there, you may never be able to get it back.

For Wall Street

These are market professionals like brokers, dealers, lawyers, advisers, accountants, and exchanges.

  1. Although ICOs can be an effective way to raise money, you must follow the securities laws if the offering involves securities. So ask yourself: Is this offering a security? Is it an investment contract? Is it, in other words, an investment of money in a common enterprise with the expectation of profit from the efforts of others? If you’re not clear on this, you need a lawyer, because the Commission will look past the form of a transaction to its substance. Just calling it a currency doesn’t settle the question. We blogged recently about this fact-intensive inquiry here.
  2. If you handle transactions in crypto-currency, you should treat them as if cash were being handed from one party to the other. You should know your customer and mind anti-money-laundering laws whenever you allow payments in crypto-currencies, allow their purchase on margin, or otherwise use them to facilitate securities transactions.

They May Be Intelligent, But Are They Wise?

Here is a wise word of caution about the emerging, expanding use of computer programs to evaluate people in the justice system, whether at bail hearings, sentencings, or elsewhere.

The author is a former software engineer at Facebook who’s now studying law at Harvard. Her point isn’t that we shouldn’t use or consult these programs, but that we should know what we’re getting into and proceed with caution. It’s troubling, for example, if we use programs that no one in the field fully understands—not judges, not lawyers, not probation officers—because the manufacturer won’t disclose a proprietary algorithm.

She says we turn to computers in part to control for our own biases, “[b]ut shifting the … responsibility to a computer doesn’t necessarily eliminate bias; it delegates and often compounds it.” That’s because these programs mimic the data we use to train them, so even the ones that accurately reflect our world will necessarily reflect our biases. They also work on a feedback loop, so unless they’re constantly retrained, they lean further into those biases and drift ever further from reality and fairness. They don’t just parrot our biases; they amplify them. She saw this phenomenon time and again as a software engineer.
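
The feedback loop she describes can be sketched with a toy simulation. Everything below is hypothetical (the group names, the 70/30 patrol split, the rates); the only point is the dynamic: two groups with identical true rates, a model with a small initial tilt, and retraining on the data the model’s own deployment generates.

```python
# Toy feedback-loop simulation; all numbers are made up for illustration.
TRUE_RATE = 0.10    # both groups truly offend at the same rate
PATROLS = 100       # patrols available per round

predicted = {"A": 0.10, "B": 0.12}   # the model starts slightly biased against B
recorded = {"A": 0.0, "B": 0.0}      # offenses the system actually observes

for _ in range(10):
    # Patrols concentrate where the model predicts more crime.
    hot, cold = sorted(predicted, key=predicted.get, reverse=True)
    patrols = {hot: 0.7 * PATROLS, cold: 0.3 * PATROLS}

    # Offenses are recorded only where patrols look, so identical true
    # rates still produce a skewed record.
    for group in recorded:
        recorded[group] += patrols[group] * TRUE_RATE

    # "Retraining": the model now mirrors its own skewed record.
    total = recorded["A"] + recorded["B"]
    predicted = {group: recorded[group] / total for group in recorded}

# A 1.2x initial tilt has hardened into more than a 2x disparity.
print(predicted)
```

Swapping the winner-take-most patrol split for a strictly proportional one preserves the initial tilt instead of growing it, which is her point either way: the computer doesn’t correct the bias it was handed; it entrenches it.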

She agrees that algorithms can work for good. They’ve reportedly helped New Jersey reduce its pretrial jail population, for example.

But let’s proceed with caution, she says:

“Computers may be intelligent, but they are not wise. Everything they know, we taught them, and we taught them our biases. They are not going to un-learn them without transparency and corrective action by humans.”

The Modern Public Square

This week brought us another unanimous U.S. Supreme Court case that’s arguably more important because it concerned the First Amendment.

The issue was a North Carolina law that made it a felony for registered sex offenders to use any social-networking site that let minors join. That meant Facebook, LinkedIn, Twitter, or pretty much any other social-media site, and the law was even broad enough to reach websites like Amazon, WebMD, and the Washington Post. So you almost couldn’t use the Internet.

The defendant was one of more than 1,000 people who’ve been prosecuted under the law. In 2002, when he was 21 years old, he had sex with a 13-year-old girl. He was charged, pleaded guilty, and registered as a sex offender. Then the law passed in 2008.

In 2010, he happened to get a traffic ticket dismissed in court, whereupon he logged on to Facebook and posted this to his timeline: “Man God is Good! How about I got so much favor they dismissed the ticket before court even started? No fine, no court cost, no nothing spent … Praise be to GOD, WOW! Thanks JESUS!”

He was indicted for that.

He moved to dismiss on the ground that the law violated the First Amendment, but the trial court denied it. He was convicted at trial and given a suspended prison sentence.

On appeal, the state courts duked it out. The court of appeals agreed with the guy, finding that the law violated the First Amendment. But the state supreme court reversed, finding the law “constitutional in all respects.”

Finally, the federal high court unanimously struck down the law because it plainly applied to websites like Facebook, LinkedIn, and Twitter among others. Facebook itself had 1.79 billion active users—or three times the population of North America.

The Court called these sites “integral to the fabric of our modern society and culture.” They had become our main sources for sharing current events, participating in the public square, and exploring human thought and knowledge. To foreclose access to them was to foreclose the legitimate exercise of First-Amendment rights.

Yes, a state could pass specific, narrowly tailored laws that regulate the type of conduct that portends crime, like contacting a minor or using a website to gather information about one.

But it couldn’t just cut people off from the public square.

Can They Search My Phone at the Border?

Suppose you go to visit your aunt in Italy, and you take your phone and tablet with you.

When you come back through customs, can they just search your devices willy-nilly?

Probably. Here’s a good overview of your rights at the border, along with some practical considerations. It’s worth reading ahead of time because the government is stepping up its enforcement at points of entry, and there have been some heavy-handed run-ins lately between agents and travelers, including U.S. citizens.

The general rule is that customs and border agents may conduct routine, reasonable searches of you and your belongings, including your electronic devices, for any reason or no reason at all. They don’t need a warrant, and they don’t need any basis to believe they’ll find evidence of a crime. It’s known as the Fourth Amendment’s border-search exception.

But how far can they go?

Can they conduct full, forensic searches or force you to give up your passwords?

According to this 2009 policy memo, the answer is yes. It says agents can seize your device, copy its contents, and search them. To do so, they can hold a device for up to five days with a supervisor’s approval. For longer periods, they must get further approval at higher levels. Ordinarily, they must conduct the search in the presence of a supervisor, too, but if that’s not feasible, they must inform a supervisor about their search as soon as possible. If they find probable cause to believe your phone contains evidence of a crime, you may not get it back for a while, if at all. If they don’t, you should get your phone back eventually, and they’re supposed to destroy any copied information.

The law is evolving, however, to require at least a reasonable suspicion for a full forensic search. That’s already the case in the federal circuit that covers California and eight other states, and the law should continue to trend in that direction. What is a reasonable suspicion? It’s a particularized and objective basis for suspecting someone of a crime.

Still, reasonable suspicion is not a tough legal standard to meet.

Plus, agents can always just ask you to unlock your phone or give up your passwords, and if you refuse, they have plenty of ways to coerce you. They can take your phone; detain you, too; search your bags more thoroughly; deny you entry if you’re visiting; or scrutinize your green-card status. Most folks just want to be on their way.

So happy trails, traveler. Leave the phone, perhaps, but take the cannoli.

The Future of Face-Recognition Technology

Face it: the future is already here. And by default, your face is ever more likely to be found in a law-enforcement database. It’s as easy as getting a driver’s license.

The facts are that face recognition is neither new nor rare, and more than one in two American adults has already been loaded into a local, state, or federal database.

That’s according to this report by the Center on Privacy and Technology at the Georgetown University Law Center. Read it to learn more about this technology, how it’s being used, and what the future holds. For three shorter stories about it, see here, here, and here.

What did the researchers do? They sent public-records requests to more than one hundred law-enforcement agencies across the country. They interviewed representatives from dozens of those agencies as well as from the technology companies they contract with. They made two site visits to agencies that use advanced face-recognition systems. And they surveyed the state of the law (or lack thereof) in all fifty states.

What are their takeaways? Here are four.

  1. The technology has value, and its use is inevitable. The report doesn’t aim to stop it.
  2. Its use is spreading rapidly and secretly without limits, standards, or public oversight.
  3. The total network of federal, state, and local databases includes over 117 million American adults. That’s more than half the country.
  4. We’re moving toward a world of continuous, real-time face recognition through public surveillance cameras.

What are their recommendations? Here are three.

  1. Congress and state legislatures should pass commonsense laws to regulate face recognition, and police should follow them before they run a search.
    • For example, to search a database of driver’s license or state identification photos, police should have a warrant backed by probable cause.
    • To search a database of mug shots, they should have a reasonable suspicion of criminal conduct. Periodically, they should scrub the database of people who were arrested but never charged or convicted. Michigan, for one, already requires that.
    • They should not use real-time, continuous surveillance except for public emergencies.
    • They should not track people based on politics, religion, or other protected status.
  2. The federal government should develop tests and best practices to improve the technology’s accuracy. For example, in the latest available test of the FBI’s database, the system included the right person on a list of fifty potential matches 86% of the time. In other words, roughly one out of seven searches returned a list of fifty innocent look-alikes, and the other six returned the right person alongside 49 of them.
  3. All governments should report their use of the technology, audit such use regularly, and respect civil rights and liberties.
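
To see where “one out of seven” comes from, the arithmetic in recommendation 2 can be checked directly. A quick sketch; the 86% hit rate and the fifty-candidate list size are the report’s figures, and the rest just follows:

```python
# Checking the accuracy arithmetic from the report's figures.
hit_rate = 0.86   # chance the true match appears somewhere in the list
list_size = 50    # candidates returned per search

# Out of every 100 searches:
misses = 100 * (1 - hit_rate)   # searches returning fifty innocent look-alikes
hits = 100 * hit_rate           # searches that do include the right person

# About one in seven searches comes up empty:
odds_of_miss = 1 / (1 - hit_rate)

# Even a "successful" search surfaces 49 innocent faces:
look_alikes_per_hit = list_size - 1
```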

Man Gets Indicted By His Pacemaker

Actually, the man was indicted by a grand jury in Ohio, which charged him with arson and insurance fraud.

Apparently, the man called 911 as his home burned in the background. He said he was sleeping when the fire started and that, in a hurry, he packed a bunch of bags, broke a window with his cane, threw the bags out the window, and carried them away. He mentioned that he had a pacemaker.

The police came to suspect him of arson. They say they found gasoline on his shoes, pants, and shirt, and they believe the fire had multiple points of origin from outside the house.

So they got a search warrant for the data from his pacemaker. That gave them a historical record of his heart rate and rhythms before, during, and after the fire.

Reportedly, the data showed that the man was active when he was supposed to be asleep, and a cardiologist has said it was “highly improbable” that he could carry out the strenuous activities he described.

SEC Reports Enforcement Results for 2016

As we wind down the calendar year, the Securities and Exchange Commission has already reported its enforcement results for the fiscal year that ended September 30.

In case you missed it, here’s the press release. Naturally, there’s some back-patting, but if the past predicts the future, the agency isn’t slowing down. Its numbers have climbed steadily over the last dozen years, and it continues to ramp up its use of big-data analytics and the whistleblower program, which it launched in 2011.

Here are some highlights from 2016.

  • The agency filed a total of 868 cases, which was a new single-year high.
  • It filed a record number of cases involving investment companies or advisers and a record number under the Foreign Corrupt Practices Act.
  • It obtained over $4 billion in judgments and orders, which matched the haul from each of the last two years.
  • It awarded more money to whistleblowers ($57 million) than in all prior years combined.

If At First You Don’t Succeed

Here’s that DUI case we alluded to last week.

It’s based on a driver’s challenge to his license suspension after his arrest. His post-arrest blood test showed a blood-alcohol concentration (or BAC) of 0.23 percent. He challenged this finding at the DMV’s administrative hearing and lost. He then petitioned the superior court to overturn that finding and lost again.

After twice losing before the agency and the trial court, he took another swing in the court of appeal, and there, he won.

The issue was whether his blood-test result was reliable.

The crime lab had tested his sample using a machine called a gas chromatograph. It has a heated chamber with two columns through which a sample is passed in gaseous form, and therein lies the rub. You’ve got to use both of those columns. One isn’t enough. Otherwise, you may get a false positive or the machine may indicate more alcohol than actually exists.

According to the driver’s expert and even the machine’s own manufacturer, one column could “tentatively identify” alcohol but “simply [could not] confirm its identity” or “how much might be present.”

In this case, the lab used the right machine, but the test results showed data from only one column, and the DMV didn’t offer any proof to show otherwise.

Thus the DMV could not rely on the test results because, as a matter of scientific principle, one column’s result was incapable of establishing the driver’s BAC.

And so the court of appeal reversed.
