A man raises his hand during a meeting at Google offices in New York.

AMSTERDAM — Google and other search engines were thrust into an unwanted new role Tuesday — caretaker of people’s reputations — when Europe’s highest court ruled that individuals should have some say over what information pops up when their names are Googled.

The landmark ruling by the Court of Justice of the European Union will force search engines to decide when to censor computer users’ search results across the 28-nation bloc of more than 500 million people.

The decision — which cannot be appealed — was celebrated by some as a victory for privacy rights in the Internet age. Others warned it could lead to online censorship.

The ruling applies to EU citizens and all search engines in Europe, including Yahoo and Microsoft’s Bing.

It has no immediate impact on the way Google and other search engines display their results in the U.S. or other countries outside Europe.

But it could create logistical headaches for such companies by forcing them to make judgment calls about the fairness of information published on other websites.

In its ruling, the EU court said search engines must listen and sometimes comply when people ask for the removal of links to newspaper articles or other sites containing outdated or otherwise objectionable information.

Google Inc. has long maintained that people with such complaints should take it up with the websites that posted the material.

“This is a disappointing ruling for search engines and online publishers in general,” the Mountain View, California, company said in a statement.

Though Europe is one of Google’s biggest markets, the decision isn’t expected to have much effect on the company’s earnings. That’s because it has no direct bearing on the online ads that Google places alongside its search results.

It’s unclear exactly how the European court envisions Google and others handling complaints.

Google, though, has dealt with similar situations in the past.

The company already censors some of its search results in several countries to comply with local laws. For instance, Google and other search engines are banned from displaying links to Nazi paraphernalia and certain hate speech in Germany and France.

The company also has set up a process so people can have their images blurred if they appear in Google’s street-level photographic maps.

What Google and other search engines have sought to avoid is acting as the arbiters of what kind of information to include in their searches.

These companies rely on formulas, or algorithms, and automated “crawlers” that roam the Internet and gather up results in response to search requests.

“There’s not much guidance for Google on how to figure out how and when they are supposed to comply with take-down requests — they just know they have to weigh the public interest,” said Joel Reidenberg, a Fordham University law professor now visiting Princeton University.

The case was referred to the European court by Spain’s National Court, which asked for guidance in the case of Mario Costeja, a Spaniard who found that a search on his name turned up links to a notice that his property was due to be auctioned over an unpaid welfare debt. The notice had been published in a Spanish newspaper in 1998 and was picked up by Google’s crawlers when the newspaper digitized its archive.

Costeja argued that the debt had long since been settled, and he asked the Spanish privacy agency to have the reference removed. In 2010, the agency agreed, but Google refused and took the matter to court, arguing that it should not be asked to censor material that had been legally published by the newspaper.