Suramya's Blog : Welcome to my crazy life…

December 1, 2022

Analysis of the claim that China/Huawei is remotely deleting videos of recent Chinese protests from Huawei phones

Filed under: Computer Hardware,Computer Software,My Thoughts,Tech Related — Suramya @ 2:23 AM

There is an interesting piece of news slowly spreading across the internet: Melissa Chen is claiming on Twitter that Huawei phones are automatically deleting videos of the protests that took place in China, without notifying their owners. Interestingly, I was not able to find any other source reporting this issue. All references/reports of this issue link back to this one tweet, which is not supported by any external validation. The tweet does not even provide enough information to verify that this is happening, other than a single video shared as part of the original tweet.


Melissa Chen claiming on Twitter that videos of protests are being automatically deleted by Huawei without notification

However, it is an interesting exercise to think about how this could have been accomplished, what the technical requirements for this to work would look like, and whether this is something that would actually happen. So let's go ahead and dig in. In order to delete a video remotely, we would need the following:

  • The capability to identify the videos that need to be deleted without impacting other videos/photos on the device
  • The capability to issue commands to the device remotely that all sensitive videos from xyz location taken at abc time need to be nuked, and to monitor the success/failure of those commands
  • The capability to identify the devices whose data needs to be looked at, keeping in mind that the device could have been in airplane mode during the filming

Now, let's look at how each of these could be accomplished, one at a time.

The capability to identify the videos that need to be deleted without impacting other videos/photos on the device

There are a few ways we could identify the videos/photos to be deleted. If it was a video distributed from a single source then we could have used a hash value of the video to identify and delete it. Unfortunately, in this case the video in question is recorded by the device itself, so each video file will have a different hash value, and this approach will not work.
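A minimal Python sketch illustrates why hash matching only works for redistributed copies of a single file (the byte strings here are stand-ins for real video data):

```python
import hashlib

def file_sha256(data: bytes) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    return hashlib.sha256(data).hexdigest()

# Two copies of the same distributed video produce identical hashes...
original = b"...video bytes..."
copy = b"...video bytes..."
print(file_sha256(original) == file_sha256(copy))  # True

# ...but two independent recordings of the same event differ, even by
# a single byte, so their hashes share nothing in common.
recording_a = b"...camera A footage..."
recording_b = b"...camera B footage..."
print(file_sha256(recording_a) == file_sha256(recording_b))  # False
```

This is exactly how platforms block re-uploads of known files, and exactly why it cannot find a video that was recorded on the device itself.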

The second option is to use the metadata in the file to identify the date & time, along with the physical location, of the video to be deleted. If videos were recorded within a geo-fenced area in a specific timeframe then we potentially have the information required to identify the videos in question. The main problem is that the user could have disabled geo-tagging of photos/videos taken by the phone, or the date/time stamp might be incorrect.
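As a rough sketch of how such a metadata filter could work (the record layout and field names are entirely hypothetical, and the geofence is a crude bounding box rather than a proper great-circle calculation):

```python
from datetime import datetime

def in_geofence(lat, lon, center_lat, center_lon, radius_deg=0.01):
    """Crude bounding-box check; a real system would use great-circle
    distance, but this is enough to illustrate the idea."""
    return abs(lat - center_lat) <= radius_deg and abs(lon - center_lon) <= radius_deg

def videos_to_delete(videos, center, start, end):
    """Select videos whose metadata places them inside the geofence
    during the target time window."""
    selected = []
    for v in videos:
        if v["lat"] is None or v["lon"] is None:
            continue  # geo-tagging disabled: metadata alone can't identify it
        if start <= v["taken_at"] <= end and in_geofence(v["lat"], v["lon"], *center):
            selected.append(v["name"])
    return selected

# Hypothetical metadata records for illustration only
videos = [
    {"name": "protest.mp4", "lat": 39.9042, "lon": 116.4074,
     "taken_at": datetime(2022, 11, 27, 21, 0)},
    {"name": "birthday.mp4", "lat": 31.2304, "lon": 121.4737,
     "taken_at": datetime(2022, 11, 27, 21, 5)},
    {"name": "no_gps.mp4", "lat": None, "lon": None,
     "taken_at": datetime(2022, 11, 27, 21, 10)},
]
print(videos_to_delete(videos, (39.9042, 116.4074),
                       datetime(2022, 11, 27, 20, 0),
                       datetime(2022, 11, 27, 23, 0)))  # ['protest.mp4']
```

Note how the video with geo-tagging disabled simply falls through the filter, which is exactly the weakness described above.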

One way to bypass this attempt to save the video would be to have the app/phone create a separate geo-location record of every photo/video taken by the device, even when GPS or geo-tagging is disabled. This would require significant changes to the OS/app, and since a lot of people have been scrutinizing the code on Huawei phones ever since they were accused of being used by China to spy on the Western world, it is hard to imagine this would have escaped notice.

If the app was saving the data in the video/photo itself rather than in a separate location, then it should be easy enough to validate by examining the image/video data of photos/videos taken by any Huawei phone. But I don’t see any claims/reports that prove this is happening.

The capability to issue commands to the device remotely that all sensitive videos from xyz location taken at abc time need to be nuked, and to monitor the success/failure of those commands

Coming to the second requirement, Huawei or the government would need the capability to remotely activate the functionality that deletes the videos. In order to do this, the phone would need to connect to a Command & Control (C&C) channel frequently to check for commands, or the phone would need something listening for remote commands from a central server.

Both of these are hard to disguise and hide. Yes, there are ways to hide data in DNS queries and other such methods to cover one's tracks, but thanks to botnets, malware and ransomware campaigns, the ability to identify hidden C&C channels is highly developed, and it is hard to hide from everyone looking for this. If the phone has something listening for commands, then scanning the device for open ports/apps listening for connections would be an easy thing to check, and even if the listening app is disguised it should be possible to identify that something is listening.
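A basic version of such a check can be sketched in a few lines of Python. Pointed at the phone's IP address over Wi-Fi, it reports which TCP ports accept connections (this is only a simple connect scan, not a full audit; tools like nmap do this far more thoroughly):

```python
import socket

def find_listening_ports(host, ports, timeout=0.2):
    """Return the subset of ports accepting TCP connections on host."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            # connect_ex returns 0 when the connection succeeds,
            # i.e. when something is listening on that port.
            if s.connect_ex((host, port)) == 0:
                open_ports.append(port)
    return open_ports
```

Running it against a phone's Wi-Fi address, e.g. `find_listening_ports("192.168.1.20", range(1, 1025))`, would surface any non-standard listener; a disguised service can fake its banner, but it cannot hide the open socket from this kind of scan.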

You might say that the activation commands could be hidden in the normal traffic going to & from the Huawei servers, and while that is possible, we can check for it by installing a root certificate and passing all the traffic to/from the device through a proxy to be analyzed. Not impossible to do, but hard to achieve without leaving signs, and considering the scrutiny these phones are going through, it is hard to accept that this is happening without anyone finding out about it.

The capability to identify the devices whose data needs to be looked at (keeping in mind that the device could have been in airplane mode during the filming)

Next, we have the question of how Huawei would identify the devices that need to run the check for videos. One option would be to issue the command to all their phones anywhere in the world. This would potentially be noisy, and there is a possibility that a sharp-eyed user catches the command in action. A far more likely option would be to issue it against a subset of their phones. This subset could be all phones in China, all phones that visited the location in question around the time the protest happened, or all phones that are in or around the location at present.

In order for the system to identify users in an area, there are a few options. One would be to use GPS location tracking, which would require the device to constantly track its location and share it with a central server. Most phones already do this. One potential problem would be users disabling GPS on the device, but other than that this would be an easy request to fulfill. Another option is to use cell tower triangulation to locate/identify the phones in the area at a given time. This is something that is easily done on the provider side and, from what I read, quite common in China. Naomi Wu AKA RealSexyCyborg had a really interesting thread on this a little while ago that you should check out.

This doesn’t even account for the fact that China has CCTV coverage across most of its jurisdiction and claims to have the ability to run facial recognition across the massive amount of video collected. So it is quite easy for the government to identify the phones that need to be checked for sensitive photos/videos with existing & known technology.

Conclusion/Final thoughts

Also remember that if Huawei had the ability to issue commands to its phones remotely, then they would also have the ability to extract data from the phones or plant information on them. That would be an espionage gold mine, as people use their phones for everything and have them with them always. Losing that ability just to delete videos is not something I feel China/Huawei would do, as the harm caused by the loss of intelligence data would far outweigh the benefits of deleting the videos. Do you really think that every security agency, hacker collective, bored programmer, and antivirus/cybersec firm would not immediately start digging into the firmware/apps on every Huawei phone once it was known and confirmed that they were actively deleting files remotely?

So, while it is possible that Huawei/China has the ability to scan and delete files remotely, I doubt that is the case right now, considering that there are almost no reports of this happening anywhere, no independent verification of the claim, and it doesn’t make sense for China to burn this capability for such a minor return.

Keeping that in mind, this post seems more like a joke or fake news to me. That being said, I might be completely mistaken about all this, so if you have additional data or counterpoints to my reasoning above I would love for you to reach out and discuss this in more detail.

– Suramya

November 29, 2022

Twitter: How the user experience has been changing over the last month

Filed under: My Thoughts — Suramya @ 8:46 PM

It’s been about a month since Twitter came under new management, and it is an understatement to say that things have been chaotic over there with Musk in control. There have been plenty of articles written about what is going on at Twitter, so I am not going to go into that. Instead, since it has been a month, I wanted to share my experience of how the user experience at Twitter has changed, thoughts about its future, and general musings.

Personally, I don’t see much of a difference when I use the chronological view in Tweetcaster (which is the Twitter client I prefer). It just shows me all the posts by the people I follow in chronological order, which is how things have been so far. However, most of the infosec folks, and a lot of the experts, have reduced their posting on Twitter and are posting on Mastodon instead. Some post in both locations, others have moved completely. Looking at the accounts, it looks like most of the non-white, non-male accounts have moved to Mastodon, since the reduction in moderation means they are facing an increasing amount of harassment from idiots. I have also been reading about how some of the functionality on the site is flaky; recently the block function was working only intermittently.

In addition to reading the chronological feed I also occasionally view the feed in the Recommended View (or Home View, as Twitter calls it), and there is a drastic change in the content shown when I use this view. Earlier I would get about 80% of the content from the people I follow and about 20% other trending posts and recommendations. Now it is the other way round. Last I checked, in the first 10 pages of scrolling only 3-4 tweets were from accounts I follow; the rest were all recommendations and ‘popular’ tweets.

Specifically, it appears that tweets which talk positively about Musk are promoted more by the algorithm, and out of 10 tweets shown at least 2-3 would be tweets praising Musk. Based on my likes and the accounts I follow this should not be the case, as a lot of them are fairly critical of how he is doing things. Worst case, I would expect to see an even breakdown of tweets for and against him, as he is quite controversial right now. But it looks like the algorithm has been tweaked to give higher priority to tweets that praise him. Please note that this is anecdotal evidence based on what I have seen and felt, rather than a formal study of the algorithm.

In addition, a lot more crypto scam accounts are showing up in my feeds as well. This is scary, because they look and sound quite reasonable but at the end of the day they are scams. The recommendation system that creates the feed for every user seems to be quite broken, and a lot of the stuff being shown to me is stuff I am not interested in. For example, today my feed was flooded with Bollywood movie news, which is something I am not at all interested in.

If I stay with the chronological view then things appear to be normal. Most of the brands and services are still there on Twitter, and it is still a good way to reach out to companies for escalation or help. How much longer that will continue depends on how much more chaotic things get at Twitter’s offices. That being said, I don’t see the service shutting down soon, but there is a good chance that there will be a significant outage in the near future. If the trend of folks moving away continues then Twitter will end up becoming less diverse than it is today, which is a shame.

I don’t see myself stopping using it every day yet (as long as I stick to the chronological view of my timeline). But a lot of the interesting posters have moved to Mastodon, so I will probably be spending more time over there as time passes.

On a side note, I have been adding more features to my program to extract data from Twitter and the new version will be released in the next day or two.

Well this is all for now. Will post more later.

– Suramya

November 28, 2022

Internet Archive makes over 500 Palm Pilot apps available online for free

Filed under: Interesting Sites,Tech Related — Suramya @ 5:05 AM

The Palm Pilot was the first ‘smart’ device that I owned, and coincidentally it was the first device that I bought with my own money, so it always has a special place in my heart. I started off with the Palm V and then upgraded to the m505 when it came out. I loved the device and used it almost constantly for a long time. Unfortunately, they made a bunch of bad business decisions and the company collapsed.

Now, the Internet Archive has created an online archive of 565 Palm Pilot apps available to run in your web browser and on touchscreen devices. The apps are not as sophisticated as what you get nowadays, but they are a blast from the past, and some of them stand up to the passage of time quite well.

Check out the archive at: Software Library: Palm and Palmpilot.
More details on the project: The Internet Archive just put 565 Palm Pilot apps in your web browser

– Suramya

November 19, 2022

I am a speaker at SmartBharat 2022 Conference

Filed under: My Life,Tech Related — Suramya @ 11:56 PM

Happy to announce that I am one of the speakers at SmartBharat 2022 and I will be presenting on “IoT and Opensource: Re-purposing hardware & Improving interoperability”. My session is scheduled for 24 November at 12:30 PM in hall 2. As a kid I would read EFY regularly, and now I am presenting at one of their conferences, so this is a pretty big deal for me.


You can register for the conference at: https://www.iotshow.in/

If you are coming for the conference do stop by and say hello, I am planning on being there for all three days of the conference. Post the conference I will share the slides (and the video if possible) here.

– Suramya

November 18, 2022

Twitter Extract: Downloading data not exposed in the Official Data export

Filed under: Computer Software,Software Releases — Suramya @ 2:46 PM

It looks like Twitter is imploding, and even though I don’t think it will go down permanently, it seemed like a good time to export my data so that I have a local backup available. Usually I just ask Twitter to export my data, but this time I needed more, as I was preparing a backup that would work even if Twitter went down completely. Specifically, I needed an export of the following, which isn’t in the official export:

  • Owned Lists (Including Followers and Members of the list)
  • Subscribed Lists with Followers and Members of the List
  • List of all Followers (ScreenName, Fullname and ID)
  • List of all Following (ScreenName, Fullname and ID)

So I created a script that exports the above. It is available for download at: Github: TwitterExtract. Instructions for installation and running are in the ReadMe file.

This was created as a quick and dirty solution, so it is not production-ready (i.e., it doesn’t have a lot of error checking, hardening, etc.), but it does what it is supposed to do. Check it out and let me know what you think. Bug fixes and additional features are welcome…

– Suramya

November 15, 2022

Extracting Firefox Sites visited for archiving

Filed under: Computer Software,Linux/Unix Related,Tech Related — Suramya @ 3:01 AM

I have been using Firefox since its first version (0.1) launched back in 2003. At that time it was called Phoenix; it was renamed to Firebird due to a trademark claim from Phoenix Technologies, and then renamed again to Firefox. Over the years I have upgraded in place, so I had assumed that all my browser history was still safely stored in the browser. A little while ago I realized that this wasn’t the case, as there is a history page limit defined in about:config. The property is called

places.history.expiration.transient_current_max_pages: 137249

and on my system it is configured for 137249 entries. This was a disappointment, as I wanted to save an archive of the sites I have visited over the years, so I started looking at how to export the history from Firefox from the command line so that I could save it in another location as part of my regular backup. I knew that the history is stored in a SQLite database, so I looked at the contents of the DB using a SQLite viewer. The DB was simple enough to understand, but I didn’t want to reinvent the wheel, so I searched on Google to see if anyone else had already written the queries to extract the data, and found this Reddit post that gave the command to extract the data into a file.

I tried the command out and it worked perfectly, with just one small hitch: the command would not run unless I shut down Firefox, as the DB file was locked by FF. This was a big issue, as it meant I would have to close the browser every time the backup ran, which is not feasible: the backup process needs to be as transparent and seamless as possible.

Another search pointed me to this site, which explained how to connect to a locked DB in read-only mode. This was exactly what I needed, so I took the code from there, merged it with the previous version, and came up with the following command:

sqlite3 'file:places.sqlite?immutable=1' \
  "SELECT strftime('%d.%m.%Y %H:%M:%S', visit_date/1000000, 'unixepoch', 'localtime'), url
   FROM moz_places, moz_historyvisits
   WHERE moz_places.id = moz_historyvisits.place_id
   ORDER BY visit_date;" > dump.out

This command gives us output that looks like this:

28.12.2020 12:30:52|http://maps.google.com/
28.12.2020 12:30:52|http://maps.google.com/maps
...
...
14.11.2022 04:37:17|https://www.google.com/?gfe_rd=cr&ei=sPvqVZ_oOefI8AeNwZbYDQ&gws_rd=ssl,cr&fg=1

Once the file is created, I back it up with my other files as part of the nightly backup process on my system. In the next phase I am thinking about dumping this data into a PostgreSQL DB so that I can put a UI in front of it that will allow me to browse/search the history. But for now this is sufficient, as the data is being backed up.

I was able to get my browsing history going back to 2012 by restoring the oldest Firefox backup I have on the system and extracting the data from it. I still have some DVDs with even older backups, so when I get some time I will restore and extract the data from those as well.

Well this is all for now. Will write more later.

– Suramya

November 14, 2022

IBM Unveils the world’s largest Quantum Computer with 433 qubits

Filed under: My Thoughts,Quantum Computing — Suramya @ 2:01 AM

Scaling up quantum computers has become a race between the various players in the market, and IBM has raised the stakes by unveiling a 433-qubit quantum computer, more than a 3x increase over their previous 127-qubit setup. Even with this massive gain they are still a ways off from their goal of a 4,000-qubit computer by 2025.

In this new setup IBM replaced the “quantum chandelier” used in the previous processors with flexible ribbon cables designed for cryogenic environments. These new cables allow a more efficient flow of microwave signals, which in turn decreases the interference caused by the cabling. This gave them a 77% increase in the number of connections to the chip, which enabled them to scale up more easily. They also separated the wires and components for control and readout into their own layers, which further reduced the interference with the qubits.

The new setup also includes a state-of-the-art cryo-CMOS prototype controller chip, implemented using 14-nanometer FinFET technology, that reduces the power requirement from about 100 watts per qubit to about 10 milliwatts per qubit. The new beta update for Qiskit Runtime allows the user to trade speed for a reduced error count, and a new option in the Qiskit primitives called a “resilience level” lets users dial in the cost/accuracy trade-off suitable to the task being worked on. Both features are expected to be ready for production release by 2025.

Quantum computing makes my head hurt but there is no doubt that it is changing the computing world in a massive way.

Source:
* IEEE Spectrum: IBM Unveils 433-Qubit Osprey Chip
* New Scientist: IBM unveils world’s largest quantum computer at 433 qubits

– Suramya

November 9, 2022

FOSS: Asking folks to run their own servers/services is not the answer

Filed under: My Thoughts,Tech Related — Suramya @ 1:06 AM

A few days ago, in a FOSS (Free and Open Source Software) group that I am part of, a discussion was going on about Twitter and how it is imploding due to the recent changes. One of the members commented that “Both Twitter and Gmail are private services (not public utilities). Hence FOSS. Hence self-host your blog / email.” This is a very problematic view that is unfortunately quite common amongst techies. They (we) tend to believe that everyone has the time, knowledge, interest and resources to do things the way we do.

In the early 2000s I hosted my site & blog on a VPS (Virtual Private Server) which I maintained on my own. It was a great experience because I got to learn Linux sysadmin skills in a live environment, and I did it for a few years. Then, as my responsibilities and workload started increasing, I had less time to devote to managing the server; I also had issues with the cost, so I ended up moving hosting providers and switching to a shared hosting plan. Since I was moving to a different role, I just wanted to host my site and not worry about managing the server, and this move allowed me to do that. I can move back to a VPS if I need to, since I have the tech background and skills to manage it. Expecting everyone to do the same is nonsensical and impractical. I know the time it took me to walk my parents through how to access their email from various computers & phones. Just thinking about asking them to manage sendmail/postfix servers and secure them is enough to give me nightmares about hours on the phone troubleshooting.

FOSS is a great thing; it has made life easier and allowed us to retain control of the devices that we use to manage a large portion of our lives. However, it is not practical to expect everyone to have the skills to host their own servers. Imagine if other services did the same thing: you would need to run your own sewage treatment plant to process your waste, manage your own power generation, or grow your own food. That sounds pretty nonsensical, right? Which is how you sound when you tell folks to run their own servers for stuff that shouldn’t need it (like email or social media), unless you only want to communicate with a microscopic portion of the population and feel superior to everyone else.

Our goal is to encourage people to use FOSS whenever possible, and that requires us to make the software usable and stable, with a shallow learning curve and smooth/easy onboarding. If you think that people will learn about server configs just to access your product then you are dreaming. For example, GIMP is awesome software, but its UI sucks, which is why it has been unable to gain popularity and beat Photoshop. One of the great mods for GIMP, GIMPShop, which came out in 2006, modified the UI to make it similar to Photoshop, and people loved it. Other software/systems have the same problem as well. The most recent example is Mastodon, which is pretty cool software, but the onboarding process, which should explain how you access it and set up an account, is something I am still confused about. I had a client I worked with early in my career who would ask me to “Just make it work” when faced with a complicated software setup. She was smart as hell but didn’t have the time to waste setting up and configuring software, as that took time away from her core responsibilities.

The general user will go for ease of use; they will go for easy onboarding and accessibility. IRC was an amazing protocol, but the clients sucked (I mean, they worked, but they didn’t have mobile versions and were not user-friendly). As the years passed, newer protocols and clients with snazzy UIs came into the picture (e.g. Slack), which enabled them to take over as the communication channel for a lot of communities. We can moan and complain that IRC was much better, but from the end user’s perspective it wasn’t better, because it didn’t allow them to do what they wanted using the devices they wanted to use. Like it or not, mobile is here to stay, and not having a native mobile client made IRC a hard sell. (There are a few clients now, but the damage is done.)

Usability is not a curse word. We need to start making the software/systems we create more user-friendly. I am not saying remove the advanced/power functionality; I would be one of the first to leave if you did that. A good example of how to balance the two is the approach Firefox takes: a general UI for all users with sensible defaults, plus a configuration setup that allows power users to go in and modify pretty much every aspect of the system.

Coming back to Twitter, the fix for this current issue is not to run our own servers but to make the existing systems interoperable, the same way email systems are interoperable. Cory Doctorow has a fantastic post, “How to ditch Facebook without ditching your friends”, where he talks about how this could work. It would require pressure (regulatory/government/user) on the companies to adopt this model, but in the long run it would remove the walled gardens that have popped up everywhere and restore the older, more distributed style of internet.

I still need to figure out if I want to join a Mastodon server, and if so which one. I will probably look into this later this month once I have some free time.

Well this is all for now. Will post more later.

– Suramya

October 21, 2022

Disable Dark Theme in the Private Browsing mode in Firefox 106

Filed under: Computer Software,Computer Tips,Knowledgebase,Tech Related — Suramya @ 10:09 AM

A lot of people like dark themes for their apps, but I am not one of them. Dark mode strains my eyes, so I usually disable it as soon as possible. In the latest Firefox update (v106), Firefox changed a bunch of defaults, and one of the changes is that when you open a window in incognito mode it uses the dark theme by default. As per the release notes, this is a conscious decision:

We also added a modern look and feel with a new logo and updated it so Private Browsing mode now defaults to dark theme, making it easier to know when you are in Private Browsing mode.

The dark theme really annoys me, so I started looking for ways to disable it. Unfortunately, it can’t be disabled from the normal settings without changing my default theme (which is to use the system default), which I didn’t want to do, and a quick internet search didn’t return any useful results. So I decided to check the about:config section to see if there was a hidden setting, and lo and behold, there it was. A quick change disabled the dark theme for Private Browsing mode and things were back to normal.

The steps to disable the dark theme in incognito mode are as follows:

  • Type about:config in the address bar and press Enter.
  • A warning page may appear. Click Accept the Risk and Continue to go to the about:config page.
  • Search for “theme” in the Search preference name box at the top of the page and you will see an entry for “browser.theme.dark-private-windows”
  • Double-click the value to change it from true to false.
  • The entry should look like the following. Then you can close the tab and you are done.


To revert the change, just repeat the steps and set the value back to True.
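If you prefer, the same preference can be set persistently through a user.js file in your Firefox profile directory. This is a sketch assuming a standard profile layout; prefs listed in user.js are re-applied at every startup, overriding manual about:config edits:

```javascript
// user.js — place in your Firefox profile directory.
// Keeps Private Browsing windows on the light/system theme.
user_pref("browser.theme.dark-private-windows", false);
```

This is handy if you sync or rebuild profiles and don’t want to repeat the about:config steps each time.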

– Suramya

October 20, 2022

I am a Certified Threat Intelligence Analyst (CTIA) now

Filed under: Computer Security,My Life — Suramya @ 10:17 AM

I’m happy to share that I’ve obtained a new certification: CTIA (Certified Threat Intelligence Analyst) from EC-Council.


Certification Number: ECC8907421563
Certification Name: Certified Threat Intelligence Analyst
Issue Date: October 17, 2022
Expiry Date: October 16, 2025

With this I have completed 4 out of the 5 certifications I am eligible for after my degree in Cyber Security. The last one is CHFI and I will be attempting that shortly.

Well this is all for now, will write more later.

– Suramya

