News feed: mobile technologies


Tag archive: SMS

Airship announced today that it has acquired Apptimize, an A/B testing company whose customers include Glassdoor, HotelTonight and The Wall Street Journal.
Formerly known as Urban Airship, the more concisely named Airship has built a platform for companies to manage their customer communication across SMS, push notifications, email, mobile wallets and more.
It says that by acquiring Apptimize, it can help customers test the impact of their messages. That means integrating Apptimize’s testing capabilities into the Airship platform, but the company says it will also continue to support Apptimize as a standalone platform.
“By combining Apptimize mobile app and web testing with Airship’s deep insight into customer engagement across channels, marketers and developers can focus innovation on the most critical areas while creating the seamless end-to-end experiences customers really want,” said Airship president and CEO Brett Caine in a statement.
The financial terms of the acquisition were not disclosed. Apptimize had raised a total of $18.6 million from US Venture Partners, Costanoa Ventures and others, according to Crunchbase.
Airship says it will be bringing on 19 Apptimize team members (a little less than two-thirds of the startup’s total workforce) across engineering, customer service and sales.

Facebook today released a new SDK that allows mobile app developers to integrate WhatsApp verification into Account Kit for iOS and Android. This will allow developers to build apps where users can opt to receive their verification codes through the WhatsApp app installed on their phone instead of through SMS.
Today, many apps give users the ability to sign up using only a phone number — a now popular alternative to Facebook Login, thanks to the social network’s numerous privacy scandals that led to fewer people choosing to use Facebook with third-party apps.
Plus, using phone numbers to sign up is common with a younger generation of users who don’t have Facebook accounts — and sometimes barely use email, except for joining apps and services.
When using a phone number to sign in, it’s common for the app to confirm the user by sending a verification code over SMS to the number provided. The user then enters that code to create their account. This process can also be used when logging in, as part of a multi-factor verification system where a user’s account information is combined with this extra step for added security.
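The verification flow just described (send a code over SMS or WhatsApp, then have the user type it back) can be sketched server-side in a few lines. This is an illustrative Python sketch, not Facebook's Account Kit API; the function names, the five-minute expiry, and the attempt limit are all assumptions.

```python
import hmac
import secrets
import time

# Hypothetical parameters, not Account Kit's actual values.
CODE_TTL_SECONDS = 300   # codes expire after five minutes
MAX_ATTEMPTS = 3         # throttle brute-force guessing

_pending = {}  # phone number -> [code, issued_at, attempts]

def issue_code(phone: str) -> str:
    """Generate a 6-digit code to hand to the SMS/WhatsApp gateway."""
    code = f"{secrets.randbelow(10**6):06d}"
    _pending[phone] = [code, time.time(), 0]
    return code

def verify_code(phone: str, submitted: str) -> bool:
    """Check a user-submitted code; reject expired or over-tried codes."""
    entry = _pending.get(phone)
    if entry is None:
        return False
    code, issued_at, attempts = entry
    if time.time() - issued_at > CODE_TTL_SECONDS or attempts >= MAX_ATTEMPTS:
        del _pending[phone]  # stale or exhausted: force a fresh code
        return False
    entry[2] += 1
    if hmac.compare_digest(code, submitted):  # constant-time comparison
        del _pending[phone]  # a code is single-use
        return True
    return False
```

Note that the code is consumed on success, so it cannot be replayed, which also matters when the same flow doubles as the extra step in multi-factor login.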

While this process is straightforward and easy enough to follow, SMS is not everyone’s preferred messaging platform. That’s particularly true in emerging markets like India, where 200 million people are on WhatsApp, for example. In addition, those without an unlimited messaging plan are careful not to overuse texting when it can be avoided.
That’s where the WhatsApp SDK comes in. Once integrated into an iOS or Android app, developers can offer to send users their verification code over WhatsApp instead of text messaging. They can even choose to disable SMS verification, notes Facebook.
This is all a part of Facebook’s Account Kit, a larger set of developer tools designed to let people quickly register for and log in to apps or websites using only a phone number or email address, no password required.
The WhatsApp verification-code option has been available in Account Kit’s web SDK since late 2018, but hadn’t come to mobile apps until today.

On feed-based “broader social networks, where people can accumulate friends or followers until the services feel more public … it feels more like a town square than a more intimate space like a living room,” Facebook CEO Mark Zuckerberg explained in a blog post today. With messaging, groups, and ephemeral stories as the fastest-growing social features, Zuckerberg laid out why he’s rethinking Facebook as a private living room where people can be comfortable being themselves without fear of hackers, government spying, or embarrassment from old content, all without letting bad actors use encryption to hide their crimes.
Perhaps this will just be more lip service in a time of PR crisis for Facebook. But with the business imperative fueled by social networking’s shift away from permanent feed broadcasting, Facebook can espouse the philosophy of privacy while in reality serving its shareholders and bottom line. It’s this alignment that actually spurs product change. We saw Facebook’s agility with last year’s realization that a misinformation- and hate-plagued platform wouldn’t survive long-term, so it had to triple its security and moderation staff. And in 2017, recognizing the threat of Stories, it implemented them across its apps. Now Facebook might finally see the dollar signs within privacy.

The New York Times’ Mike Isaac recently reported that Facebook planned to unify its Facebook, WhatsApp, and Instagram messaging infrastructure to allow cross-app messaging and end-to-end encryption. And Zuckerberg discussed this and the value of ephemerality on the recent earnings call. But now Zuckerberg has roadmapped a clearer slate of changes and policies to turn Facebook into a living room:
- Facebook will let users opt in to the ability to send or receive messages across Facebook, WhatsApp, and Instagram.
- Facebook wants to expand that interoperability to SMS on Android.
- Zuckerberg wants to make ephemerality automatic on messaging threads, so chats disappear by default after a month or a year, with users able to control that or put timers on individual messages.
- Facebook plans to limit how long it retains metadata on messages once it’s no longer needed for spam or safety protections.
- Facebook will extend end-to-end encryption across its messaging apps but use metadata and other non-content signals to weed out criminals using privacy to hide their misdeeds.
- Facebook won’t store data in countries with a bad track record of privacy abuse, such as Russia, even if that means having to shut down or postpone operations in a country.
You can read the full blog post from Zuckerberg below:

Posted by Mark Zuckerberg on Wednesday, March 6, 2019

A Privacy-Focused Vision for Social Networking
My focus for the last couple of years has been understanding and addressing the biggest challenges facing Facebook. This means taking positions on important issues concerning the future of the internet. In this note, I’ll outline our vision and principles around building a privacy-focused messaging and social networking platform. There’s a lot to do here, and we’re committed to working openly and consulting with experts across society as we develop this.
—
Over the last 15 years, Facebook and Instagram have helped people connect with friends, communities, and interests in the digital equivalent of a town square. But people increasingly also want to connect privately in the digital equivalent of the living room. As I think about the future of the internet, I believe a privacy-focused communications platform will become even more important than today’s open platforms. Privacy gives people the freedom to be themselves and connect more naturally, which is why we build social networks.
Today we already see that private messaging, ephemeral stories, and small groups are by far the fastest growing areas of online communication. There are a number of reasons for this. Many people prefer the intimacy of communicating one-on-one or with just a few friends. People are more cautious of having a permanent record of what they’ve shared. And we all expect to be able to do things like payments privately and securely.
Public social networks will continue to be very important in people’s lives — for connecting with everyone you know, discovering new people, ideas and content, and giving people a voice more broadly. People find these valuable every day, and there are still a lot of useful services to build on top of them. But now, with all the ways people also want to interact privately, there’s also an opportunity to build a simpler platform that’s focused on privacy first.
I understand that many people don’t think Facebook can or would even want to build this kind of privacy-focused platform — because frankly we don’t currently have a strong reputation for building privacy protective services, and we’ve historically focused on tools for more open sharing. But we’ve repeatedly shown that we can evolve to build the services that people really want, including in private messaging and stories.
I believe the future of communication will increasingly shift to private, encrypted services where people can be confident what they say to each other stays secure and their messages and content won’t stick around forever. This is the future I hope we will help bring about.
We plan to build this the way we’ve developed WhatsApp: focus on the most fundamental and private use case — messaging — make it as secure as possible, and then build more ways for people to interact on top of that, including calls, video chats, groups, stories, businesses, payments, commerce, and ultimately a platform for many other kinds of private services.
This privacy-focused platform will be built around several principles:
Private interactions. People should have simple, intimate places where they have clear control over who can communicate with them and confidence that no one else can access what they share.
Encryption. People’s private communications should be secure. End-to-end encryption prevents anyone — including us — from seeing what people share on our services.
Permanence. People should be comfortable being themselves, and should not have to worry about what they share coming back to hurt them later. So we won’t keep messages or stories around for longer than necessary to deliver the service or longer than people want them.
Safety. People should expect that we will do everything we can to keep them safe on our services within the limits of what’s possible in an encrypted service.
Interoperability. People should be able to use any of our apps to reach their friends, and they should be able to communicate across networks easily and securely.
Secure data storage. People should expect that we won’t store sensitive data in countries with weak records on human rights like privacy and freedom of expression, in order to protect that data from being improperly accessed.
Over the next few years, we plan to rebuild more of our services around these ideas. The decisions we’ll face along the way will mean taking positions on important issues concerning the future of the internet. We understand there are a lot of tradeoffs to get right, and we’re committed to consulting with experts and discussing the best way forward. This will take some time, but we’re not going to develop this major change in our direction behind closed doors. We’re going to do this as openly and collaboratively as we can because many of these issues affect different parts of society.
Private Interactions as a Foundation
For a service to feel private, there must never be any doubt about who you are communicating with. We’ve worked hard to build privacy into all our products, including those for public sharing. But one great property of messaging services is that even as your contacts list grows, your individual threads and groups remain private. As your friends evolve over time, messaging services evolve gracefully and remain intimate.
This is different from broader social networks, where people can accumulate friends or followers until the services feel more public. This is well-suited to many important uses — telling all your friends about something, using your voice on important topics, finding communities of people with similar interests, following creators and media, buying and selling things, organizing fundraisers, growing businesses, or many other things that benefit from having everyone you know in one place. Still, when you see all these experiences together, it feels more like a town square than a more intimate space like a living room.
There is an opportunity to build a platform that focuses on all of the ways people want to interact privately. This sense of privacy and intimacy is not just about technical features — it is designed deeply into the feel of the service overall. In WhatsApp, for example, our team is obsessed with creating an intimate environment in every aspect of the product. Even where we’ve built features that allow for broader sharing, it’s still a less public experience. When the team built groups, they put in a size limit to make sure every interaction felt private. When we shipped stories on WhatsApp, we limited public content because we worried it might erode the feeling of privacy to see lots of public content — even if it didn’t actually change who you’re sharing with.
In a few years, I expect future versions of Messenger and WhatsApp to become the main ways people communicate on the Facebook network. We’re focused on making both of these apps faster, simpler, more private and more secure, including with end-to-end encryption. We then plan to add more ways to interact privately with your friends, groups, and businesses. If this evolution is successful, interacting with your friends and family across the Facebook network will become a fundamentally more private experience.
Encryption and Safety
People expect their private communications to be secure and to only be seen by the people they’ve sent them to — not hackers, criminals, over-reaching governments, or even the people operating the services they’re using.
There is a growing awareness that the more entities that have access to your data, the more vulnerabilities there are for someone to misuse it or for a cyber attack to expose it. There is also a growing concern among some that technology may be centralizing power in the hands of governments and companies like ours. And some people worry that our services could access their messages and use them for advertising or in other ways they don’t expect.
End-to-end encryption is an important tool in developing a privacy-focused social network. Encryption is decentralizing — it limits services like ours from seeing the content flowing through them and makes it much harder for anyone else to access your information. This is why encryption is an increasingly important part of our online lives, from banking to healthcare services. It’s also why we built end-to-end encryption into WhatsApp after we acquired it.
In the last year, I’ve spoken with dissidents who’ve told me encryption is the reason they are free, or even alive. Governments often make unlawful demands for data, and while we push back and fight these requests in court, there’s always a risk we’ll lose a case — and if the information isn’t encrypted we’d either have to turn over the data or risk our employees being arrested if we failed to comply. This may seem extreme, but we’ve had a case where one of our employees was actually jailed for not providing access to someone’s private information even though we couldn’t access it since it was encrypted.
At the same time, there are real safety concerns to address before we can implement end-to-end encryption across all of our messaging services. Encryption is a powerful tool for privacy, but that includes the privacy of people doing bad things. When billions of people use a service to connect, some of them are going to misuse it for truly terrible things like child exploitation, terrorism, and extortion. We have a responsibility to work with law enforcement and to help prevent these wherever we can. We are working to improve our ability to identify and stop bad actors across our apps by detecting patterns of activity or through other means, even when we can’t see the content of the messages, and we will continue to invest in this work. But we face an inherent tradeoff because we will never find all of the potential harm we do today when our security systems can see the messages themselves.
Finding the right ways to protect both privacy and safety is something societies have historically grappled with. There are still many open questions here and we’ll consult with safety experts, law enforcement and governments on the best ways to implement safety measures. We’ll also need to work together with other platforms to make sure that as an industry we get this right. The more we can create a common approach, the better.
On balance, I believe working towards implementing end-to-end encryption for all private communications is the right thing to do. Messages and calls are some of the most sensitive private conversations people have, and in a world of increasing cyber security threats and heavy-handed government intervention in many countries, people want us to take the extra step to secure their most private data. That seems right to me, as long as we take the time to build the appropriate safety systems that stop bad actors as much as we possibly can within the limits of an encrypted service. We’ve started working on these safety systems building on the work we’ve done in WhatsApp, and we’ll discuss them with experts through 2019 and beyond before fully implementing end-to-end encryption. As we learn more from those experts, we’ll finalize how to roll out these systems.
Reducing Permanence
We increasingly believe it’s important to keep information around for shorter periods of time. People want to know that what they share won’t come back to hurt them later, and reducing the length of time their information is stored and accessible will help.
One challenge in building social tools is the “permanence problem”. As we build up large collections of messages and photos over time, they can become a liability as well as an asset. For example, many people who have been on Facebook for a long time have photos from when they were younger that could be embarrassing. But people also really love keeping a record of their lives. And if all posts on Facebook and Instagram disappeared, people would lose access to a lot of valuable knowledge and experiences others have shared.
I believe there’s an opportunity to set a new standard for private communication platforms — where content automatically expires or is archived over time. Stories already expire after 24 hours unless you archive them, and that gives people the comfort to share more naturally. This philosophy could be extended to all private content.
For example, messages could be deleted after a month or a year by default. This would reduce the risk of your messages resurfacing and embarrassing you later. Of course you’d have the ability to change the timeframe or turn off auto-deletion for your threads if you wanted. And we could also provide an option for you to set individual messages to expire after a few seconds or minutes if you wanted.
It also makes sense to limit the amount of time we store messaging metadata. We use this data to run our spam and safety systems, but we don’t always need to keep it around for a long time. An important part of the solution is to collect less personal data in the first place, which is the way WhatsApp was built from the outset.
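The retention scheme sketched in the preceding paragraphs (a thread-level default expiry, a user override, and optional per-message timers) can be modeled in a few lines. This is a hypothetical Python sketch of the concept, not Facebook code; the class names and the one-month default are assumptions.

```python
import time
from dataclasses import dataclass, field
from typing import List, Optional

DEFAULT_TTL = 30 * 24 * 3600.0  # hypothetical default: expire after a month

@dataclass
class Message:
    text: str
    sent_at: float
    ttl: float  # seconds this message is retained

@dataclass
class Thread:
    # Users could change this default or set per-message timers instead.
    default_ttl: float = DEFAULT_TTL
    messages: List[Message] = field(default_factory=list)

    def send(self, text: str, ttl: Optional[float] = None,
             now: Optional[float] = None) -> None:
        """A per-message ttl (e.g. a few seconds) overrides the thread default."""
        now = time.time() if now is None else now
        self.messages.append(
            Message(text, now, self.default_ttl if ttl is None else ttl))

    def sweep(self, now: Optional[float] = None) -> None:
        """Delete messages whose retention window has elapsed."""
        now = time.time() if now is None else now
        self.messages = [m for m in self.messages if now - m.sent_at < m.ttl]
```

The same sweep idea would apply to the metadata retention mentioned above: keep it only as long as the spam and safety systems need it, then drop it.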
Interoperability
People want to be able to choose which service they use to communicate with people. However, today if you want to message people on Facebook you have to use Messenger, on Instagram you have to use Direct, and on WhatsApp you have to use WhatsApp. We want to give people a choice so they can reach their friends across these networks from whichever app they prefer.
We plan to start by making it possible for you to send messages to your contacts using any of our services, and then to extend that interoperability to SMS too. Of course, this would be opt-in, and you would be able to keep your accounts separate if you’d like.
There are privacy and security advantages to interoperability. For example, many people use Messenger on Android to send and receive SMS texts. Those texts can’t be end-to-end encrypted because the SMS protocol is not encrypted. With the ability to message across our services, however, you’d be able to send an encrypted message to someone’s phone number in WhatsApp from Messenger.
This could also improve convenience in many experiences where people use Facebook or Instagram as their social network and WhatsApp as their preferred messaging service. For example, lots of people selling items on Marketplace list their phone number so people can message them about buying it. That’s not ideal, because you’re giving strangers your phone number. With interoperability, you’d be able to use WhatsApp to receive messages sent to your Facebook account without sharing your phone number — and the buyer wouldn’t have to worry about whether you prefer to be messaged on one network or the other.
You can imagine many simple experiences — a person discovers a business on Instagram and easily transitions to their preferred messaging app for secure payments and customer support; another person wants to catch up with a friend and can send them a message that goes to their preferred app without having to think about where that person prefers to be reached; or you simply post a story from your day across both Facebook and Instagram and can get all the replies from your friends in one place.
You can already send and receive SMS texts through Messenger on Android today, and we’d like to extend this further in the future, perhaps including the new telecom RCS standard. However, there are several issues we’ll need to work through before this will be possible. First, Apple doesn’t allow apps to interoperate with SMS on their devices, so we’d only be able to do this on Android. Second, we’d need to make sure interoperability doesn’t compromise the expectation of encryption that people already have using WhatsApp. Finally, it would create safety and spam vulnerabilities in an encrypted system to let people send messages from unknown apps where our safety and security systems couldn’t see the patterns of activity.
These are significant challenges and there are many questions here that require further consultation and discussion. But if we can implement this, we can give people more choice to use their preferred service to securely reach the people they want.
Secure Data Storage
People want to know their data is stored securely in places they trust. Looking at the future of the internet and privacy, I believe one of the most important decisions we’ll make is where we’ll build data centers and store people’s sensitive data.
There’s an important difference between providing a service in a country and storing people’s data there. As we build our infrastructure around the world, we’ve chosen not to build data centers in countries that have a track record of violating human rights like privacy or freedom of expression. If we build data centers and store sensitive data in these countries, rather than just caching non-sensitive data, it could make it easier for those governments to take people’s information.
Upholding this principle may mean that our services will get blocked in some countries, or that we won’t be able to enter others anytime soon. That’s a tradeoff we’re willing to make. We do not believe storing people’s data in some countries is a secure enough foundation to build such important internet infrastructure on.
Of course, the best way to protect the most sensitive data is not to store it at all, which is why WhatsApp doesn’t store any encryption keys and we plan to do the same with our other services going forward.
But storing data in more countries also establishes a precedent that emboldens other governments to seek greater access to their citizens’ data, and therefore weakens privacy and security protections for people around the world. I think it’s important for the future of the internet and privacy that our industry continues to hold firm against storing people’s data in places where it won’t be secure.
Next Steps
Over the next year and beyond, there are a lot more details and trade-offs to work through related to each of these principles. A lot of this work is in the early stages, and we are committed to consulting with experts, advocates, industry partners, and governments — including law enforcement and regulators — around the world to get these decisions right.
At the same time, working through these principles is only the first step in building out a privacy-focused social platform. Beyond that, significant thought needs to go into all of the services we build on top of that foundation — from how people do payments and financial transactions, to the role of businesses and advertising, to how we can offer a platform for other private services.
But these initial questions are critical to get right. If we do this well, we can create platforms for private sharing that could be even more important to people than the platforms we’ve already built to help people share and connect more openly.
Doing this means taking positions on some of the most important issues facing the future of the internet. As a society, we have an opportunity to set out where we stand, to decide how we value private communications, and who gets to decide how long and where data should be stored.
I believe we should be working towards a world where people can speak privately and live freely knowing that their information will only be seen by who they want to see it and won’t all stick around forever. If we can help move the world in this direction, I will be proud of the difference we’ve made.

Google is removing apps from Google Play that request permission to access call logs and SMS text message data but haven’t been manually vetted by Google staff.
The search and mobile giant said it is part of a move to cut down on apps that have access to sensitive calling and texting data.
Google said in October that Android apps will no longer be allowed to use the legacy permissions as part of a wider push for developers to use newer, more secure and privacy-minded APIs. Many apps request access to call logs and texting data to verify two-factor authentication codes, for social sharing, or to replace the phone dialer. But Google acknowledged that this level of access can be and has been abused by developers who misuse the permissions to gather sensitive data, or mishandle it altogether.
“Our new policy is designed to ensure that apps asking for these permissions need full and ongoing access to the sensitive data in order to accomplish the app’s primary use case, and that users will understand why this data would be required for the app to function,” wrote Paul Bankhead, Google’s director of product management for Google Play.
Any developer wanting to retain the ability to ask a user’s permission for calling and texting data has to fill out a permissions declaration.
Google will review the app and why it needs to retain access, and will weigh several considerations, including why the developer is requesting access, the user benefit of the feature that’s requesting access and the risks associated with having access to call and texting data.
Bankhead conceded that under the new policy, some use cases will “no longer be allowed,” rendering some apps obsolete.
So far, Google said, tens of thousands of developers have either submitted new versions of their apps that remove the need for call and texting permissions, or have submitted a permissions declaration.
Developers with a submitted declaration have until March 9 to receive approval or remove the permissions. In the meantime, Google has a full list of permitted use cases for the call log and text message permissions, as well as alternatives.
The last two years alone have seen several high-profile cases of Android apps or other services leaking or exposing call and text data. In late 2017, popular Android keyboard ai.type exposed a massive database of 31 million users, including 374 million phone numbers.

Eager to change the conversation from its years-long exposure of user data via Google+ to the bright, shining future the company is providing, Google has announced some changes to the way permissions are approved for Android apps. The new process will be slower, more deliberate and, hopefully, more secure.
The changes are part of “Project Strobe,” a “root-and-branch review of third-party developer access to Google account and Android device data and our philosophy around apps’ data access.” Essentially, Google decided it was time to update the complex and likely not entirely cohesive set of rules and practices around those third-party developers and API access.
One of those roots (or perhaps branches) was the bug discovered inside Google+, which theoretically (the company can’t tell if it was abused or not) exposed non-public profile data to apps that should have received only a user’s public profile. This, combined with the fact that Google+ never really justified its own existence in the first place, led to the service essentially being shut down. “The consumer version of Google+ currently has low usage and engagement,” Google admitted. “90 percent of Google+ user sessions are less than five seconds.”
But the team doing the review has plenty of other suggestions to improve the process of informed consent to sharing data with third parties.
The first change is the most user-facing. When an application wants to access your Google account data — say your Gmail, Calendar and Drive contents for a third-party productivity app — you’ll have to approve each one of those separately. You’ll also have the opportunity to deny access to one or more of those requests, so if you never plan on using the Drive functionality, you can just nix it and the app will never get that permission.

These permissions can also be delayed and gated behind the actions that require them. For instance, if this theoretical app wanted to give you the opportunity to take a picture to add to an email, it wouldn’t have to ask up front when you download it. Instead, when you tap the option to attach a picture, it would ask permission to access the camera then and there. Google went into a little more detail on this in a post on its developer blog.
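The pattern described above (prompt for each permission separately, and only at the moment a feature needs it) can be sketched abstractly. This is a hypothetical Python model of the idea, not Google's Android or OAuth API; the `PermissionBroker` name, the `prompt` callback, and the `attach_photo` feature are all invented for illustration.

```python
from typing import Callable, Dict

class PermissionBroker:
    """Tracks per-permission grants; each permission is asked for
    individually, on first use, rather than all up front at install."""

    def __init__(self, prompt: Callable[[str], bool]):
        self._prompt = prompt            # stands in for the OS permission dialog
        self._granted: Dict[str, bool] = {}

    def check(self, permission: str) -> bool:
        if permission not in self._granted:   # ask only on first use
            self._granted[permission] = self._prompt(permission)
        return self._granted[permission]

def attach_photo(broker: PermissionBroker) -> str:
    # Camera access is requested here, at the point of use, so the user
    # sees the prompt in context; denial degrades just this feature.
    if not broker.check("camera"):
        return "attachment unavailable"
    return "photo attached"
```

Because each grant is recorded separately, denying the camera here would not affect, say, a previously granted Drive or Calendar scope, which mirrors the per-request approval Google describes.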
Notably, there is only the option to “deny” or “allow,” but no “deny this time” or “allow this time,” options I find useful when you’re not totally on board with the permission in question. You can always revert the setting manually, but it’s nice to have the option to say “okay, just this once, strange app.”
The changes will start rolling out this month, so don’t be surprised if things look a little different next time you download a game or update an app.
The second and third changes have to do with limiting which data from your Gmail and messaging can be accessed by apps, and which apps can be granted access in the first place.
Specifically, Google is restricting access to these sensitive data troves to apps “directly enhancing email functionality” for Gmail and your default calling and messaging apps for call logs and SMS data.
There are some edge cases where this might be annoying to power users; some have more than one messaging app that falls back to SMS or integrates SMS replies, and this might require those apps to take a new approach. And apps that want access to these things may have trouble convincing Google’s review authorities that they qualify.
Developers also will need to review and agree to a new set of rules governing what Gmail data can be used, how they can use it and the measures they must have in place to protect it. For example, apps are not allowed to “transfer or sell the data for other purposes such as targeting ads, market research, email campaign tracking, and other unrelated purposes.” That probably puts a few business models out of the running.
Apps looking to handle Gmail data will also have to submit a report detailing “application penetration testing, external network penetration testing, account deletion verification, reviews of incident response plans, vulnerability disclosure programs, and information security policies.” No fly-by-night operations permitted, clearly.
There also will be additional scrutiny of the permissions developers ask for, to make sure they match up with what the app requires. If you ask for Contacts access but don’t actually use it for anything, you’ll be asked to remove it, as it only increases risk.
These various new requirements will go into effect next year, with application review (a multi-week process) starting on January 9; tardy developers will see their apps stop working at the end of March if they don’t comply.
The relatively short timeline here suggests that some apps may in fact shut down temporarily or permanently due to the rigors of the review process. Don’t be surprised if early next year you get an update saying service may be interrupted due to Google review policies or the like.
These changes are just the first handful issuing from the recommendations of Project Strobe; we can expect more to appear over the next few months, though perhaps not such striking ones. To say Gmail and Android apps are widely used is something of an understatement, so it’s understandable that they would be focused on first, but there are many other policies and services the company will no doubt find reason to improve.