Mobile app security, or “Who will inspect the verifiers?”
“Congratulations, you are the second person
to break into Van der Voda’s safe today.
Thus, Mr. Ocean,
you have joined the long ranks of those
who made titanic efforts
to achieve their goal
and, in the end, came only second.
You do not know the names of these people,
because they are covered in oblivion.
Do you know the word ‘oblivion’?
It means that
everyone will forget about you, forever.”
Night Fox (from the film “Ocean’s Twelve”)
Hello, Habr reader!
Imagine that you are a tailor and you have made a man a suit to order. The man told you how he wants to look in this suit, where he will wear it, and how much he is prepared to pay for it. You listened to him carefully, took all the measurements, and lovingly sewed this beautiful dream suit, using all the modern fashion trends and following every wish of your dear customer. And then the finest hour arrives: the suit is ready, the man puts it on and is happy with what he sees in the mirror. In the evening he calls and says that his wife and the guests at his anniversary party liked it too. But one of the guests declared that the suit has flaws: it is not yellow, you cannot put out a fire in it, anyone can steal it, and (would you believe it!) it has no hood, and you cannot fit a hammer or a saw into the pocket.
But excuse me, my good man, what yellow? What fire, what hood? This is a suit for formal occasions. The man says that he himself is surprised and puzzled: he likes the suit, and it is sewn perfectly. You advise him to ignore that guest and stop inviting him, to avoid any more nonsense of that kind. You both laugh, wish each other a pleasant evening, and say goodbye. The holder of the Opinion is forgotten.
Recently we found ourselves in this tailor’s shoes. We built a mobile application to order for a client company. We discussed every step with them, taking into account the needs of the users (the users of this particular application, not of just ANY mobile application), and agreed on every element. Everyone was happy, including the users, until a certain Guest appeared with his OPINION. We would gladly file that OPINION away somewhere, but it is so clumsy and angular that it is unlikely to fit anywhere.
Judge for yourself
Almost all of us use smartphones and mobile applications. And, of course, some of us think about the security of the personal data we store on the phone. One way or another, though, most of that security rests on application developers: safe storage and transfer of data is one of the most important tasks they deal with. It is not surprising that customer companies try to play it safe and sometimes hand the finished application over for review by a third-party company (preferably a widely known one).
This happened to us as well: one of our company’s applications was submitted to such a “check”. The result was a presentation describing how bad everything was. Below we give the reviewers’ arguments, our comments, and our conclusions. Enjoy.
Connection to the server is made using the https protocol and TLS traffic encryption, which is the de facto standard for modern applications. However, the implementation on the device does not meet all the requirements for correct use of https. In particular, the application establishes a connection after receiving any TLS certificate. This vulnerability allows confidential application data to be fully eavesdropped:
• All transmitted data can be read in clear text or modified.
• PAN credit card numbers, CVC/CVV codes, and the user’s personal data leak.
• Payment forms can be spoofed and data intercepted.
Colleagues, come now: not just any TLS certificate, but any certificate trusted at the operating-system level. Of course, the user can “inadvertently” add a certificate belonging to an attacker who has gained control of the communication channel. But this “vulnerability” is inherent in principle to all websites. Nevertheless, life goes on, and people keep registering and making purchases on those sites; much depends on the caution of the user himself. In the case of an application, security can be raised by embedding the specific certificate used by the server into the application (SSL pinning). But this requires coordination with the customer, and it prevents the certificate (or at least the corresponding key) from being changed on the server without updating the application.
Conclusion: The argument is not entirely accurate. Most applications, as well as browsers, operate at this same level of connection security. Worded this way, though, it can badly scare the customer, which was clearly the goal.
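For the curious, SSL pinning of the kind mentioned above can be implemented via a `URLSession` delegate that compares the server’s leaf certificate against a copy bundled with the app. This is only a minimal sketch, not our production code; the resource name `server.der` is an assumption for illustration:

```swift
import Foundation
import Security

// A minimal SSL-pinning sketch: accept the connection only if the
// server's leaf certificate byte-for-byte matches the copy shipped
// in the app bundle ("server.der" is a hypothetical resource name).
final class PinningDelegate: NSObject, URLSessionDelegate {
    func urlSession(_ session: URLSession,
                    didReceive challenge: URLAuthenticationChallenge,
                    completionHandler: @escaping (URLSession.AuthChallengeDisposition, URLCredential?) -> Void) {
        guard let trust = challenge.protectionSpace.serverTrust,
              let serverCert = SecTrustGetCertificateAtIndex(trust, 0),
              let pinnedURL = Bundle.main.url(forResource: "server", withExtension: "der"),
              let pinnedData = try? Data(contentsOf: pinnedURL)
        else {
            completionHandler(.cancelAuthenticationChallenge, nil)
            return
        }
        let serverData = SecCertificateCopyData(serverCert) as Data
        if serverData == pinnedData {
            completionHandler(.useCredential, URLCredential(trust: trust))
        } else {
            // Certificate does not match the pinned copy: refuse the connection.
            completionHandler(.cancelAuthenticationChallenge, nil)
        }
    }
}

let session = URLSession(configuration: .default,
                         delegate: PinningDelegate(),
                         delegateQueue: nil)
```

Note that pinning the exact certificate, as sketched here, is precisely what ties the app to the server’s key: rotating the certificate then requires shipping an app update, which is the coordination problem described above.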
Client data is stored on the device in clear text. Neither the system security features (Keychain, iOS Data Protection) nor any encryption of this data is used. This vulnerability allows:
On any device (without jailbreak or other modifications), user data can be obtained within a minute of connecting it to a computer, for example with the iFunBox file manager.
This statement is misleading: it looks as though our colleagues checked the application on an unlocked phone. In fact, iOS Data Protection is used to protect the data (the NSFileProtectionComplete flag, and we double-checked this). It means that when a device with a configured PIN or Touch ID is locked, the file is encrypted and can only be decrypted after unlocking. If the device is not locked, then talking about protecting data from an attacker with physical access to it is pointless: he can simply launch the application and see everything.
We did consider raising security with a PIN code. However, using one in an application for data protection is a debatable idea. If it is optional, the user may not set it, just as he may not set one on the device itself. If it is mandatory, users will have to enter two PIN codes: one for the phone and one for the application. Besides, without a PIN code on the device, things like mail, the browser (with its cache and saved passwords), calls, and messages remain unprotected. With access to those, an attacker can not only inflict significant damage, but will most likely also gain access to the user’s account on the server (if one exists and, for example, has a “forgot password” function). And the application hardly stores data more critical than the things just listed.
Conclusion: The argument is erroneous. We believe the reviewer either made this mistake himself or deliberately counted on the customer making it when trying to verify the claim. The data cannot be read from a locked phone, and with an unlocked device in hand you can do everything through the open application anyway.
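To make the NSFileProtectionComplete point concrete, here is a sketch (with an illustrative file name, not our actual storage code) of how a file gets the Data Protection class under which it is encrypted whenever the device is locked:

```swift
import Foundation

// A sketch of writing a file under iOS Data Protection. With
// .completeFileProtection (NSFileProtectionComplete) the file is
// encrypted whenever the device is locked and readable only after
// unlock. "client.db" is a hypothetical file name.
let url = FileManager.default
    .urls(for: .documentDirectory, in: .userDomainMask)[0]
    .appendingPathComponent("client.db")

do {
    let payload = Data("sensitive client data".utf8)
    try payload.write(to: url, options: .completeFileProtection)

    // Equivalently, the protection class of an existing file can be set:
    try FileManager.default.setAttributes(
        [.protectionKey: FileProtectionType.complete],
        ofItemAtPath: url.path)
} catch {
    print("Failed to store protected data: \(error)")
}
```

This is exactly why checking such an application on an unlocked phone proves nothing: the protection only engages while the device is locked.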
If a jailbreak is installed, the application data, along with the database and other files, can be copied remotely.
Protecting or guaranteeing anything “if a jailbreak is installed” is a completely different story. We generally recommend not guaranteeing anyone anything “with a jailbreak installed”. For example, things like this exist: github.com/iSECPartners/ios-ssl-kill-switch. How can we protect a user who installs that for himself? Is a check for an installed jailbreak being suggested? Firstly, any jailbreak check can be circumvented. Secondly, the user may well be unaware that his device has a jailbreak, and here it is worth asking whom we are protecting: the user from attacks, or the application from the user. Thirdly, the presence of such checks can cause problems during App Store review, or at runtime, because many of these checks work by having the application attempt things that are not allowed on non-jailbroken devices.
Conclusion: The argument is erroneous. We will not protect the application and user data from the user himself.
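To illustrate why we consider such checks unreliable: a typical jailbreak check (a generic sketch, not anything from our app) looks for well-known artifacts, and every one of these probes is trivially hooked or hidden on a jailbroken device, where the attacker controls the runtime:

```swift
import Foundation

// A deliberately naive jailbreak check of the kind such reviews tend
// to suggest. Each probe is easily faked on a jailbroken device, and
// the write probe is exactly the sort of "forbidden" behavior that can
// raise flags during App Store review.
func looksJailbroken() -> Bool {
    let suspiciousPaths = [
        "/Applications/Cydia.app",
        "/bin/bash",
        "/usr/sbin/sshd",
        "/private/var/lib/apt"
    ]
    if suspiciousPaths.contains(where: FileManager.default.fileExists(atPath:)) {
        return true
    }
    // Sandboxed apps must not be able to write outside their container;
    // succeeding here suggests the sandbox is broken.
    let probe = "/private/jb_probe.txt"
    if (try? "x".write(toFile: probe, atomically: true, encoding: .utf8)) != nil {
        try? FileManager.default.removeItem(atPath: probe)
        return true
    }
    return false
}
```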
System screenshots are not masked and may contain customers’ private payment data. System screenshots are stored in the device’s memory and are easily accessible when it is connected to a computer.
Once again, an unlocked phone was evidently what was checked. iOS Data Protection also covers system screenshots, so they are not accessible on a locked phone. And if the phone is unlocked and someone has access to it, nothing prevents them from seeing the screenshots anyway.
Conclusion: The argument is far-fetched and designed to sow panic, since the screenshots are protected at the system level.
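If a customer nevertheless insisted on masking the snapshot the system takes when the app goes to the background, a common approach (a sketch under the assumption of a classic UIKit app delegate, not our actual code) is to cover the window before backgrounding:

```swift
import UIKit

// Cover the key window before the system captures the app-switcher
// snapshot, and remove the cover when the app returns to foreground.
class AppDelegate: UIResponder, UIApplicationDelegate {
    var window: UIWindow?
    private var coverView: UIView?

    func applicationDidEnterBackground(_ application: UIApplication) {
        let cover = UIView(frame: window?.bounds ?? .zero)
        cover.backgroundColor = .white   // or a branded splash view
        window?.addSubview(cover)
        coverView = cover
    }

    func applicationWillEnterForeground(_ application: UIApplication) {
        coverView?.removeFromSuperview()
        coverView = nil
    }
}
```

But again: on an unlocked device in an attacker’s hands this changes little, since the live application is right there.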
The application does not check for an attached debugger at startup, which allows its algorithms to be reconstructed and the application to be modified.
“Reconstruct the algorithms”: it is unclear what the purpose of this would be and why it is worth writing about. We have no secret algorithms, and the application can only be modified if a jailbreak is present, which we have already covered: that can only harm the user himself.
Conclusion: The argument is meaningless. It is unclear why it was brought up at all, and whether the author of this so-called “presentation” understands what he is writing.
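For completeness, the kind of debugger check presumably meant is the standard sysctl query for the P_TRACED flag (a generic sketch, not something our app needed). Note the obvious circularity: anyone capable of attaching a debugger is equally capable of patching this check out.

```swift
import Darwin

// Ask the kernel whether the current process is being traced
// (i.e., a debugger such as lldb is attached).
func isBeingDebugged() -> Bool {
    var info = kinfo_proc()
    var size = MemoryLayout<kinfo_proc>.stride
    var mib: [Int32] = [CTL_KERN, KERN_PROC, KERN_PROC_PID, getpid()]
    sysctl(&mib, UInt32(mib.count), &info, &size, nil, 0)
    return (info.kp_proc.p_flag & P_TRACED) != 0
}
```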
As you can see, the express analysis does touch on valid security topics, but their interpretation and presentation raise serious questions.
Of all these arguments, the only one worth acting on is strengthening the connection by applying certificate pinning. We proposed this to the customer some time ago, but made no progress in resolving the issue on the server side. Perhaps this analysis will serve as an additional argument in our dialogue with the customer.
All in all, the comrades produced a presentation full of red ink and big words and, as you have already seen, shallow claims, and it caused panic at the customer company.
Was this analysis objective, and what was its purpose?
Were our client’s time and money spent wisely?
Is it professionally ethical to level knowingly incorrect arguments against someone else’s product?
As the saying goes, who will check the checkers themselves?
We are always glad to receive constructive criticism of our products, but here the level of argumentation and the adequacy of the approach compromise the very idea of cross-analysis.
Have a nice day and success in developing applications ... secure applications.