#aifacialrecognition

petapixel (unofficial) petapixel@ծմակուտ.հայ
2021-08-17

Intel is Dropping Development of its AI-Powered RealSense Cameras

Intel's RealSense cameras were originally designed for touch-free interactions but pivoted specifically to facial recognition in January. Today, Intel confirmed that it is "winding down" its high-tech camera and sensor development to focus on its core chip business.

In a statement to CRN, Intel says that it has decided to pull the plug on camera and sensor development. While the move may seem unexpected, Engadget points out that the RealSense team's leader, Sagi Ben Moshe, announced he was leaving Intel two weeks ago.

"We are winding down our RealSense business and transitioning our computer vision talent, technology and products to focus on advancing innovative technologies that better support our core businesses and IDM 2.0 strategy," an Intel spokesperson said in an emailed statement to CRN. "We will continue to meet our commitments to our current customers and are working with our employees and customers to ensure a smooth transition."

RealSense was originally pitched to prospective customers as a fast, easy way to build products that were equipped with computer vision. The RealSense line consisted of stereoscopic, LiDAR, and coded light cameras and camera modules that would be able to support high frame rates and high resolutions in various form factors.
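
For a sense of what that "fast, easy" pitch looked like in practice, below is a minimal sketch that reads a single depth frame through the open-source librealsense Python wrapper (pyrealsense2). It is an illustration only, not any particular customer's integration: it assumes a RealSense depth camera is attached, and the stream settings (640x480 at 30 fps) are just illustrative defaults.

```python
# Minimal sketch: grab one depth frame from an attached RealSense camera
# using the librealsense Python wrapper. Stream settings are illustrative.
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)

pipeline.start(config)
try:
    frames = pipeline.wait_for_frames()
    depth = frames.get_depth_frame()
    # Distance, in meters, to whatever sits at the center of the frame.
    print("Center distance: %.2f m" % depth.get_distance(320, 240))
finally:
    pipeline.stop()
```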

Credit: Intel Corporation

Eventually, the RealSense team pivoted to focus specifically on facial recognition in a way that is most easily compared to Apple's FaceID technology.

“Intel RealSense ID combines active depth with a specialized neural network, a dedicated system-on-chip and embedded secure element to encrypt and process user data quickly and safely,” the company said in January, also promising that it would work just as fast as customers had become accustomed to with Apple Face ID.

Intel was selling its RealSense ID for as little as $99 per module, or $750 for a pack of 10. Intel began shipping units in March, but with this announcement it will only complete current orders and will not accept new ones as it winds down a business that has been selling units for just five months.

CRN spoke with Kent Tibbils, Vice President of Marketing at ASI, a Fremont, California-based distributor of RealSense products. He told CRN that while he wasn't aware of the plan to shutter the entire RealSense business, it did not come as a surprise, since only a few customers were buying small numbers of units. He said it was a highly specialized niche product that wasn't selling quickly or in large volumes.

Engadget notes that this decision makes sense in light of Intel's stated goals. CEO Pat Gelsinger has said that he wants Intel to reclaim the chipmaking crown, a battle it has been losing to AMD. To that end, the company is shifting its focus and resources toward those core goals, and dropping RealSense, which wasn't moving the needle much, is a byproduct of that decision.

#news #technology #aifacialrecognition #apple #depthsensing #faceid #facialrecognition #intel #intelrealsense #realsense

petapixel (unofficial) petapixel@ծմակուտ.հայ
2021-07-15

Facial Recognition Misidentifies Black Teen, Ignites Debate Over its Ethics

A Detroit-area skating rink is under fire for barring entry to a Black teenager after its facial recognition cameras misidentified her as a woman who was banned from the property. It has further ignited debate on the ethics of using facial recognition technology in the United States.

14-year-old Lamya Robinson was barred from entering Riverside Arena -- located in Livonia, Michigan -- after the skating rink's facial recognition cameras determined that she was a woman who had been involved in a "brawl" there previously. She and her parents, Juliea and Derrick, say that not only is the woman flagged by the software not their daughter, but Lamya had never been to the skating rink before and was at home when the brawl occurred.

"I was so confused because I've never been there," the young girl said. "I was like, that is not me. Who is that?"

The situation touches on the ethics of using facial recognition technology and underscores a long-standing issue: cameras have historically had difficulty properly recognizing facial features and exposing for darker skin tones, a problem that dates back to film photography. Some companies, like Google, are trying to make adjustments to fix it, but it's not something that has been widely addressed across the imaging industry.

"To me, it's basically racial profiling," Juliea Robinson told Fox 2 News Detroit. "You're just saying every young Black, brown girl with glasses fits the profile and that's not right."

Fox 2 News Detroit

Lamya Robinson's experience is not the first time facial recognition cameras have failed to properly identify a Black person. In June of 2020, Robert Julian-Borchak Williams of Farmington Hills, Michigan was falsely identified as a man who authorities say stole $3,800 worth of merchandise from a Detroit Shinola retail store. When detectives ran security footage through facial recognition software, it pointed to Williams as the suspect.

Police arrested Williams at his home, placed him in an interrogation room, and put the three photos from the security footage in front of him.

"When I look at the picture of the guy, I just see a big Black guy. I don't see a resemblance. I don't think he looks like me at all," Williams said in an interview with NPR.

Williams was detained for 30 hours before being released on bail. Charges against him were eventually dropped due to insufficient evidence. According to NPR, civil rights experts say Williams' experience is the first example in the United States of someone being wrongfully arrested based on a faulty facial recognition match.

On July 13, 2021, Williams took to Capitol Hill to testify before the House Judiciary Subcommittee on Crime, Terrorism, and Homeland Security, which is currently looking into the use of facial recognition cameras by law enforcement, including the "risk to civil liberties and due process posed by this technology," according to Detroit News. Williams told his story and expressed his desire to see Congress act.

The video below is time-stamped for 38:50, which is the start of Williams' testimony.

Both Williams' and Lamya Robinson's cases are being pointed to by activists who are calling for retailers not to use facial recognition on customers or workers in their stores. Tawana Petty, who heads up Data 4 Black Lives, one of those activist organizations, says that because of the problems with how cameras see darker skin tones, the use of the technology can cause significant harm.

"I don't want to go to Walmart and be tackled by an officer or security guard because they misidentified me for something I didn't do," she told Fox 2 News.

According to organizations opposed to the use of facial recognition, Lowe's and Macy's are among the retailers that already use facial recognition in stores, while Walmart, Kroger, Home Depot, and Target are among those that do not but may be considering it.

Clearly, the technology is already in use in independent businesses as well, such as the Riverside Arena skating rink. The Robinson family is considering legal action against the business.

"One of our managers asked Ms. Robinson (Lamya's mother) to call back sometime during the week. He explained to her, this is our usual process, as sometimes the line is quite long and it's a hard look into things when the system is running," the Riverside Arena skating rink said in a statement to Fox 2 News. "The software had her daughter at a 97 percent match. This is what we looked at, not the thumbnail photos Ms. Robinson took a picture of, if there was a mistake, we apologize for that."

#culture #law #news #ai #aicameras #aifacialrecognition #artificialintelligence #civilliberties #civilrights #detroit #ethical #ethics #facialrecognition #facialrecognitioncameras #legal #wrongfularrest

petapixel (unofficial) petapixel@ծմակուտ.հայ
2021-04-22

‘FaceBlocker’ Tech Prevents Clients from Using Screenshots of Proofs

Waldo Photos has launched a new AI-powered mobile sales platform that pairs with a technology called FaceBlocker. Together, they allow photographers to copyright-protect proofs, prevent theft, and maximize sales opportunities.

When combined with the company's mobile sales platform, the new FaceBlocker technology addresses several key protection issues professionals often run across.

Waldo Photos is a platform that allows photographers to easily share photos with their online community via automated proof delivery, facial recognition, jersey recognition, and AI-powered sorting. The launch of WaldoPro adds some interesting additional features to the platform:

  • Photomanager - an AI-powered SaaS platform for hosting, managing, and publishing photos with advanced analytics
  • Sell Photos - a mobile sales platform leveraging FaceBlocker copyright protection, text-based proof delivery, a mobile app ordering process, and drop-ship print delivery
  • Share Photos - an automated mobile delivery platform for event photography
  • Member Connect - tools for marketing and remarketing, including personalized direct mail and a text-based communication platform

The AI-driven system provides better engagement and discovery by potential clients, as their proofs are delivered via SMS alerts, giving the photographer a fully contactless sales model and options to sell additional images even months after the initial shoot. Add the FaceBlocker service, which lets the photographer deliver easily accessible, good-quality proofs to the client's smartphone while making them difficult to screenshot or copy instead of purchasing, and the company has a pretty robust, automated, and protected sales system.

Waldo Photos says that over the last year, the FaceBlocker technology has been used by professional photographers at national dance and cheer competitions, Miss America, Miss USA, national gymnastics competitions, and more. Additionally, photographers who have adopted the system have reported an increase in after-event sales of over 100%. Those are lofty claims, but the system is designed to funnel customers into a powerful sales channel, so they aren't particularly surprising.

FaceBlocker takes the images uploaded to Waldo Photos and places the Waldo logo over the face of the potential purchaser, identified via facial recognition. The FaceBlocked proofs are then delivered to the client's mobile device, where the client can tap their face on the screen to remove the logo. However, when the client's face is revealed, the rest of the photo is blurred. This process makes it easy for the client to get an idea of what the whole image will look like, but at the same time makes it nearly impossible to save or screenshot it without placing a legitimate purchase order.
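
To make that blocked/revealed proofing idea concrete, here is a minimal sketch using OpenCV. It is an illustration only, not Waldo's actual implementation: the file names, the Haar-cascade face detector, and the blur settings are all assumptions.

```python
# Sketch of a FaceBlocker-style proofing step (NOT Waldo's implementation).
# Assumes OpenCV is installed and "proof.jpg" / "logo.png" are local files.
import cv2

def make_proofs(photo_path: str, logo_path: str) -> None:
    image = cv2.imread(photo_path)
    logo = cv2.imread(logo_path)

    # Detect faces with OpenCV's bundled frontal-face Haar cascade.
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

    # "Blocked" proof: paste the logo over each detected face.
    blocked = image.copy()
    for (x, y, w, h) in faces:
        blocked[y:y + h, x:x + w] = cv2.resize(logo, (int(w), int(h)))

    # "Revealed" proof: blur everything, then restore the original face
    # regions, mimicking the tap-to-reveal view described above.
    revealed = cv2.GaussianBlur(image, (51, 51), 0)
    for (x, y, w, h) in faces:
        revealed[y:y + h, x:x + w] = image[y:y + h, x:x + w]

    cv2.imwrite("proof_blocked.jpg", blocked)
    cv2.imwrite("proof_revealed.jpg", revealed)

make_proofs("proof.jpg", "logo.png")
```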

Once the client has purchased an image, the photos can be delivered through the Waldo platform without any obstruction or blurring, and if they order prints, those can be shipped directly to the purchaser, which Waldo says saves the photographer hours of back and forth with print labs and shipping companies.

The FaceBlocker tech alone is an unusual and likely highly effective way of monetizing every photo, and combined with Waldo's interface, the whole platform seems tailor-made to help photographers make the most of their work by addressing real problems they face daily.

Waldo offers a demo and a 30-day trial via its website for those interested in seeing if the platform is a fit for their business.

#news #software #technology #ai #aifacialrecognition #aipowered #artificialintelligence #photomanagementplatform #sales #salesplatform #waldo #waldophotomanager #waldopro
