Suramya's Blog : Welcome to my crazy life…

August 24, 2018

Fixing the appstreamcli error when running apt-get update

Filed under: Computer Software,Knowledgebase,Linux/Unix Related,Techie Stuff — Suramya @ 12:05 AM

Over the past few days, every time I tried to update my Debian system using apt-get, it failed with the following error message:

(appstreamcli:5574): GLib-CRITICAL **: 20:49:46.436: g_variant_builder_end: assertion '!GVSB(builder)->uniform_item_types || 
GVSB(builder)->prev_item_type != NULL || g_variant_type_is_definite (GVSB(builder)->type)' failed

(appstreamcli:5574): GLib-CRITICAL **: 20:49:46.436: g_variant_new_variant: assertion 'value != NULL' failed

(appstreamcli:5574): GLib-ERROR **: 20:49:46.436: g_variant_new_parsed: 11-13:invalid GVariant format string
Trace/breakpoint trap
Reading package lists... Done
E: Problem executing scripts APT::Update::Post-Invoke-Success 'if /usr/bin/test -w /var/cache/app-info -a -e /usr/bin/appstreamcli; then appstreamcli refresh-cache > 
/dev/null; fi'
E: Sub-process returned an error code

I spent a couple of hours trying to figure out what was causing it and identified a bug in appstream as the culprit, since running the command manually failed with the same error. Removing the package, as recommended by a few sites, would have taken the entire KDE desktop with it, which I didn't want, so I was at a loss as to how to fix the problem. I put the update on hold until I had a bit more time to research the issue and find a solution.

Today I got some free time and decided to try again. After a little searching I stumbled upon Bug Report #906544, where David explained that the error was caused by a bug in the upstream version of appstream; a little later Matthias commented that the issue was fixed in the latest version of the software and would flow down to the Debian repositories shortly. Normally I would have just run an apt-get update and then an install to get the latest package, but since the whole issue was that the update command wouldn't finish, I had to install the package manually.

To do that I went to the Debian site, opened the software package list for Debian Unstable (as that is what I am using) and searched for appstream. This gave me a link to the updated package (0.12.2-2) that fixed the bug (I had 0.12.2-1 installed). Once I downloaded the package (make sure you download the correct one for your system architecture), I installed it manually by running the following command as root:

dpkg -i appstream_0.12.2-2_amd64.deb

This installed the package and I was then able to run apt-get update successfully. I still get the GLib-CRITICAL warnings, but those can apparently be ignored without issues.
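The whole manual fix can be sketched as a short shell sequence. The version string is the one from this post; the download URL pattern is an assumption (I fetched the file through the packages.debian.org web interface), so adjust it for your mirror:

```shell
# Sketch of the manual fix: work out the right .deb for this machine's
# architecture and install it with dpkg (run as root).
ARCH=$(dpkg --print-architecture 2>/dev/null || echo amd64)
PKG="appstream_0.12.2-2_${ARCH}.deb"
echo "Package to fetch: $PKG"
# The pool URL below is an assumption; verify it on packages.debian.org:
# wget "https://deb.debian.org/debian/pool/main/a/appstream/${PKG}"
# dpkg -i "$PKG"
```

Once dpkg finishes, apt-get update should run to completion again.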

Hope this helps people who hit the same issue (or reminds me of the solution if/when I hit the issue again).

– Suramya

August 23, 2018

Identifying Programmers by their Coding Style

Filed under: Computer Security,Computer Software,Techie Stuff — Suramya @ 8:42 PM

There is an interesting development in the field of identifying people by what they write. As some of you may already know, researchers have long been able to identify who wrote a particular text by analysing things like word choice, sentence structure, syntax and punctuation, using a technique called stylometry. Until now, however, this was limited to natural languages, not artificial ones like programming languages.

Now, new research by Rachel Greenstadt and Aylin Caliskan, professors of computer science at Drexel University and George Washington University respectively, proves that code, like other forms of writing, is not anonymous. They used machine learning algorithms to de-anonymize coders, and the really cool part is that they can do this even with code decompiled from binaries, with a reasonable level of confidence. So you don't need access to the original source code to identify who wrote it (assuming the training database contains code samples from the author).

Here’s a simple explanation of how the researchers used machine learning to uncover who authored a piece of code. First, the algorithm they designed identifies all the features found in a selection of code samples. That’s a lot of different characteristics. Think of every aspect that exists in natural language: there are the words you choose, the way you put them together, sentence length, and so on. Greenstadt and Caliskan then narrowed the features down to only those that actually distinguish developers from each other, trimming the list from hundreds of thousands to around 50 or so.

The researchers don’t rely on low-level features, like how code was formatted. Instead, they create “abstract syntax trees,” which reflect code’s underlying structure, rather than its arbitrary components. Their technique is akin to prioritizing someone’s sentence structure, instead of whether they indent each line in a paragraph.
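To make the AST-feature idea concrete, here is a toy sketch (not the paper's actual pipeline): count syntax-tree node types in Python source and compare two samples with cosine similarity. The feature choice and the similarity measure are my own assumptions for illustration:

```python
# Illustrative sketch of AST-based stylometric features: count node
# types in the parse tree (structure, not formatting) and compare
# samples with cosine similarity. A real de-anonymization system uses
# far richer features and a trained classifier.
import ast
import math
from collections import Counter

def ast_features(source: str) -> Counter:
    """Count AST node types -- a crude stand-in for syntactic features."""
    tree = ast.parse(source)
    return Counter(type(node).__name__ for node in ast.walk(tree))

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two feature-count vectors."""
    keys = set(a) | set(b)
    dot = sum(a[k] * b[k] for k in keys)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

sample_a = "def add(a, b):\n    return a + b\n"
sample_b = "def mul(x, y):\n    return x * y\n"
print(cosine(ast_features(sample_a), ast_features(sample_b)))
```

Note that the two samples score very similar despite using different names and operators, which is exactly why structural features survive renaming while formatting-based ones don't.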

This is both really cool and a bit scary, because suddenly we have the ability to identify who wrote a particular piece of code. This removes, or at least reduces, people's ability to release code/software anonymously. That is a good thing when we look at a piece of malware or a virus, because now we can find out who wrote it, making it easier to prosecute cyber criminals.

However, the flip side is that we can now also identify people who write code to secure networks, bypass restrictive regimes' firewalls, create privacy applications and so on. There are a lot of people who contribute to open source software but don't want to be identified, for various reasons. For example, if a programmer in China created software that allows users to bypass the Great Firewall of China, they would definitely not want the Chinese government to be able to identify them, for obvious reasons. Similarly, there are folks who have written software that they do not want associated with their real names, and this research would make staying anonymous much harder for them.

But this is not the end of the world; there are ways around this, such as using software to scramble the code. I don't think many such systems exist right now, and if they do they are at a nascent stage. If this research is broadly applied to start identifying coders, then writing such scramblers would become a high priority, and lots of very smart people would start focusing their efforts on defeating the detectors.

Well this is all for now. Will write more later.

– Suramya

Original source: Schneier’s Blog

September 27, 2016

How to install Tomato Firmware on Asus RT-N53 Router

Filed under: Computer Software,Knowledgebase,Techie Stuff,Tutorials — Suramya @ 11:43 PM

I know I am supposed to blog about all the trips I took, but I wanted to get this down before I forget what I did to get the install working. I will post about the trips soon. I promise 🙂

Installing an alternate firmware on my router is something I have been meaning to do for a few years now, but I never really had the incentive to investigate in detail, as the default firmware worked fine for the most part and I didn't really miss any of the special features the new firmware would have given me.

Yesterday my router decided to start acting funny: every time I transferred large files from my phone to the desktop via SFTP over WiFi, the entire router would crash after about a minute. This had never happened before, and I have transferred gigs of data this way, so I was stumped. Luckily I had a spare router lying around, thanks to Dad, who made me carry it to Bangalore during my last visit. So I swapped the old router for the spare and got my work done. With the old router sitting on my desk and some time to kill, I decided to install a custom firmware on it to play with.

I was initially planning to install DD-WRT, but their site refused to let me download the file for the RT-N53 model even though the wiki said the model was supported. A quick web search suggested that folks have had a good experience with the Tomato by Shibby firmware, so I downloaded and installed it by following these steps:

Download the firmware file

First we need to download the firmware file from the Tomato Download site.

  • Visit the Tomato download Section
  • Click on the latest Build folder. (I used build5x-138-MultiWAN)
  • Click on ‘Asus RT-Nxx’ folder
  • Download the ‘MAX’ zip file as that has all the functionality. (I used the tomato-K26-1.28.RT-N5x-MIPSR2-138-Max.zip file.)
  • Save the file locally
  • Extract the ZIP file. The file we are interested in is under the ‘image’ folder with a .trx extension

Restart the Router in Maintenance mode

  • Turn off power to router
  • Turn the power back on while holding down the reset button
  • Keep holding reset until the power light starts flashing, which means the router is in recovery mode

Set a Static IP on the Ethernet adapter of your computer

For some reason, you need to set the IP address of the computer you are using to the static address 192.168.1.2, with subnet mask 255.255.255.0 and gateway 192.168.1.1. If you skip this step, the firmware upload fails with an integrity check error.
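On a Linux box, one way to apply that static configuration is with the iproute2 tools. This is a network-config fragment to run as root, and the interface name "eth0" is an assumption; check yours with `ip link` first:

```shell
# Assumption: "eth0" is the wired adapter connected to the router.
# Run as root; your normal network config returns after a reboot
# or after your network manager reclaims the interface.
ip addr flush dev eth0
ip addr add 192.168.1.2/24 dev eth0
ip route add default via 192.168.1.1
```

(192.168.1.2/24 is the CIDR form of address 192.168.1.2 with netmask 255.255.255.0.)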

Upload the new firmware

  • Connect the router to a computer using a LAN cable
  • Visit 192.168.1.1
  • Login as admin/admin
  • Click Advanced Setting from the navigation menu at the left side of your screen.
  • Under the Administration menu, click Firmware Upgrade.
  • In the New Firmware File field, click Browse to locate the new firmware file that you downloaded in the previous step
  • Click Upload. The uploading process takes about 5 minutes.
  • Then unplug the router, wait 30 seconds.
  • Hold down the WPS button while plugging it back in.
  • Wait 30 seconds and release the WPS button.

Now you should be using the new firmware.

  • Browse to 192.168.1.1
  • Login as admin/password (if that doesn’t work try admin/admin)
  • Click on the ‘reset nvram to defaults’ link on the page that comes up. (I had to do this before the system started working, but apparently it's not always required.)

Configure your new firmware

That’s it, you now have a router with a working Tomato install. Go ahead and configure it as per your requirements. All functionality seems to be working for me except the 5GHz network, which seems to have disappeared. I will play around with the settings a bit more to see if I can get it to work, but since I hardly ever connected to the 5GHz network it's not a big deal for me.

References

The following sites and posts helped me complete the install successfully. Without them I would have spent way longer getting things to work:

Well this is it for now. Will post more later.

– Suramya

February 25, 2016

Indian Patent office rejects Software patents

Filed under: Computer Software,My Thoughts — Suramya @ 8:00 PM

As you know, software patents are something of a scourge in the computer industry and are hated for the most part (except by the companies using them to make money and stifle innovation and competition). There is extensive debate on the topic, all of which boils down to the following three questions:

  • Should software patents even be allowed? If they are then how do we define the boundary between patentable and non-patentable software?
  • Is the inventive step and non-obviousness requirement applied too loosely to software?
  • Are software patents discouraging innovation instead of encouraging it?

On 19th February 2016 the Indian patent office issued guidelines adopting the following three-part test to determine the patentability of Computer Related Inventions (CRIs), which effectively precludes software by itself from being patented:

  • Properly construe the claim and identify the actual contribution;
  • If the contribution lies only in mathematical method, business method or algorithm, deny the claim;
  • If the contribution lies in the field of computer programme, check whether it is claimed in conjunction with a novel hardware and proceed to the other steps to determine patentability with respect to the invention. The computer programme in itself is never patentable. If the contribution lies solely in the computer programme, deny the claim. If the contribution lies in both the computer programme as well as hardware, proceed to the other steps of patentability.

This is a great step towards ensuring that useless/basic ideas don't get patented and stifle innovation.

– Suramya

Source: Press Release: Indian Patent Office Says No to Software Patents

April 30, 2015

Microsoft is becoming more and more OpenSource Friendly

Filed under: Computer Software,My Thoughts — Suramya @ 8:32 PM

Gone are the days when MS compared open source software to a cancer. If you are wondering what I mean by that statement, here's a brief history lesson: back in 2001 Steve Ballmer, then CEO of MS, said that “Linux is a cancer that attaches itself in an intellectual property sense to everything it touches.” He made other similar statements and accusations over the years during his time at the head of MS. Now that he is finally out of the picture, MS has suddenly become a lot friendlier to the open source movement, and over the past few months it has made major announcements to woo developers back to the Windows ecosystem.

Today MS made two major announcements at its Build developer conference that mark another step in the right direction for the company. The first was the launch of Visual Studio Code, a free cross-platform code editor for OS X, Linux and Windows.

This is the first release of a cross-platform code editor from Microsoft; until now all of their offerings required you to be running Windows, which immediately prevented developers on Linux or Mac OS from using their software. That is no longer the case, though it remains to be seen how many folks switch to the new editor from their existing favourites. As you know, arguments over which editor is best for development are akin to a religious war among developers, so I am not sure how many will switch.

Please note that this is a preview release, so it is not ready for prime time yet, and that also means the software sends data back to MS. From the download site: “When this tool crashes, we automatically collect crash dumps so we can figure out what went wrong. If you don’t want to send your crash dumps to Microsoft, don’t install this tool.” I don't think they can be clearer than that about what they are up to.

Visual Studio Code offers developers built-in support for multiple languages and as Microsoft noted in today’s Build keynote, the editor will feature rich code assistance and navigation for all of these languages. JavaScript, TypeScript, Node.js and ASP.NET 5 developers will also get a set of additional tools.

The editor features all of the standard tools you would expect from a modern code editor, including syntax highlighting, customizable keyboard bindings, bracket matching and snippets. It also works with Git out of the box.

The IDE is available for download at this site.

The second announcement was the release of their .NET distribution for Linux and Mac. This follows up on their promise, back in November 2014, to release the core features of the .NET platform for Linux and Mac.

Microsoft says it is taking .NET cross-platform in order to build and leverage a bigger ecosystem for it. As the company also noted shortly after the original announcement, it decided that, to take .NET cross-platform, it had to do so as an open source project. To shepherd it going forward, Microsoft also launched the .NET Foundation last year.

You can download the Preview builds for the .NET core from their site.

Additional details on their announcement and other things in the pipeline are available on their blog: .NET Announcements at Build 2015.

Well this is all for now. I just finished downloading their new IDE so I am going to go try installing it and see how it looks/works. Who knows I might actually like it. 🙂

– Suramya

January 8, 2015

Microsoft Office now available for Android tablets

Filed under: Computer Software,Techie Stuff — Suramya @ 11:35 PM

MS is spending a lot of time and effort trying to overcome the cloud of their previous actions, when they did their level best to eradicate their competitors by any means necessary, even morally grey ones. The latest volley in this effort is their move to make MS Office available on Android tablets for free. They have MS Word, MS Excel and MS PowerPoint available, and the reviews so far have been very good, even though the apps are technically still in preview mode.

Today, we are excited to announce that we are further expanding the preview. We want more feedback from more users to ensure that Office apps work well on a range of different Android tablets before launching the official apps. To participate in the preview, you can use an ARM-based Android tablet running KitKat or Lollipop, with a screen size between 7″ and 10.1″. Starting today, anyone can go to Google Play and download the Word, Excel and PowerPoint preview apps. No waitlist. No requesting access. Just go and download the apps!

An office suite is one of those apps that needs to exist on every OS. I have tried a lot of the alternatives, like OpenOffice and other clones, but I keep coming back to MS Office because of the stability and compatibility it gives me with other Office users. On my Linux machine I use Crossover to have a native install of Office, and it works great. When I get some time and restore Android to my tablet (I am currently trying to install Kali Linux on it), I will try Office on it, even though I don't see myself editing a lot of documents on a tablet.

Well this is all for now. Will write more later.

– Suramya

Source: androidcentral.com and Microsoft Blog Announcement

January 7, 2015

Over 2,400 classic DOS games now playable in your web browser for free

Filed under: Computer Software,Interesting Sites — Suramya @ 11:28 PM

Last year the Internet Arcade released over 900 classic arcade games playable in a browser. Not satisfied with that accomplishment, they have topped it by releasing over 2,400 classic DOS games to the public, and as before they are all playable in your web browser. The list of games includes classics like Prince of Persia, The Oregon Trail, Castle Wolfenstein and many, many more. This collection sure brings back a lot of memories for me.

If you’re a PC gamer of a certain age (cough), you’ve probably lamented that many of the titles you played as a kid are hard to use on modern systems without downloading emulators or waiting for special re-releases. Well, it just got a lot easier to relive your gaming glory days. The Internet Archive’s growing collection of web-based retro games now includes roughly 2,400 MS-DOS classics.

I think I am going to be spending some time ensuring that the games function correctly in a browser. Purely for verification of the work done here, of course 🙂

– Suramya

Source: engadget.com

December 28, 2014

Update/Patch multiple Windows software in one shot

Filed under: Computer Software — Suramya @ 11:53 PM

One of the many downsides of using Windows is that there is no central way to update all the software installed on the computer in one shot. On Debian I can run an 'apt-get upgrade' and it will upgrade all the software installed on the system to the latest available versions. Patch My PC attempts to give you the same functionality on a Windows system. It supports around 100 different applications and can install or update them automatically when you run a scan.

I haven't tried it out myself because I don't have a Windows machine, but it is recommended by the folks at Lifehacker, so it should be stable and work as advertised.

However please use it at your own risk, I am not responsible if this manages to destroy your computer or summon an imp.

– Suramya

December 26, 2014

Writing A Virtual Machine In Excel

Filed under: Computer Software,My Thoughts,Techie Stuff — Suramya @ 6:03 PM

Microsoft Excel should soon be classified as an operating system. In the past we have seen a 3D shoot-'em-up Doom clone, a flight simulator and other games included in it as Easter eggs. Then we saw people using it to watch movies at work, and now here's the latest entry that pushes Excel way outside its comfort zone…

A programmer named Ádám was participating in a contest where he had to solve a problem in Excel, but the official rules prohibited the use of Excel macros. So he went and created an assembly-language interpreter inside Excel and used it to solve the problem instead. Talk about overkill. The idea is quite interesting, but the thought process required to imagine this as a possibility, and then actually go ahead and implement it, is mind-boggling.

This is a virtual Harvard architecture machine without writable RAM; the stack is just lots and lots of IFs. The instructions – mostly LOAD, MOV, JNZ, INC, and CMP – solve the problem of examining two inputs to see if they are multiples of each other. If you’re wondering, an example cell from Ádám’s Excel sheet looks like this:

=IF(INDEX($B$2:$B99999,$G2,1)="JMP",
   INDEX($C$2:$C99999,$G2,1),
   IF(AND(INDEX($B$2:$B99999,$G2,1)="JZ",$I2=0),
      INDEX($C$2:$C99999,$G2,1),
      IF(AND(INDEX($B$2:$B99999,$G2,1)="JNZ",$I2<>0),
         INDEX($C$2:$C99999,$G2,1),
         G2+1
      )
   )
)
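The nested IFs above are a program counter: look up the current row's opcode, and jump to the argument for JMP, for JZ when the flag is zero, or for JNZ when it isn't; otherwise fall through to the next row. That fetch/execute logic can be sketched in Python (the tiny instruction set here is my own assumption, chosen to mirror the branches in the formula):

```python
# Minimal fetch/execute loop mirroring the spreadsheet's jump logic:
# each program row is (opcode, argument); JMP/JZ/JNZ choose the next
# row exactly like the nested IFs, everything else does pc + 1.
def run(program, acc=0, max_steps=1000):
    pc = 0
    for _ in range(max_steps):  # step cap guards against infinite loops
        if pc >= len(program):
            break
        op, arg = program[pc]
        if op == "LOAD":
            acc = arg
        elif op == "INC":
            acc += arg
        elif op == "JMP":
            pc = arg
            continue
        elif op == "JZ" and acc == 0:
            pc = arg
            continue
        elif op == "JNZ" and acc != 0:
            pc = arg
            continue
        pc += 1  # the G2+1 branch: fall through to the next row
    return acc

# Count down from 3 to 0 using JNZ, like the formula's loop branch
prog = [("LOAD", 3), ("INC", -1), ("JNZ", 1)]
print(run(prog))  # -> 0
```

In the spreadsheet, column B plays the role of the opcode, column C the jump target, and $I2 the flag the conditional jumps test.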

You can check out Ádám’s solution at Hackaday.io if you are interested. I think I am going to go find my Excel for Dummies book now and get just a little bit more proficient in it.

Thanks to hackaday.com for the original article.

-Suramya

December 14, 2014

Cleaning your Linux computer of cruft and duplicate data

When you use a computer and keep copying your data forward every time you upgrade, or work across multiple systems, it is easy to end up with multiple copies of the same file. I am very OCD about organizing my data, and still I ended up with multiple copies of the same file in various locations. This could have happened because I was recovering data from a drive and needed a temporary location to save the copy, or because I forgot that I had saved the same file under another directory (having changed my mind about how to classify it). So this weekend I decided to clean up my system.

This was precipitated by the fact that after my last system reorg I didn't have a working backup strategy and needed to get my backups going again. Basically, I had moved three drives to another server and installed a new drive in my primary system to serve as the backup drive. Unfortunately this required me to format all of these drives, because they were originally part of a RAID array that I was breaking up. Once I had the drives set up, I didn't get the chance to copy the backup data to the new drive and re-enable the cron job that took the daily backup snapshots (mostly because I was busy with other stuff). Today, when I started copying data to the new backup drive, I remembered reading about software that lets you search for duplicate data, so I thought I should try it out before copying data around. It is a good thing I did, because I found a lot of duplicates and ended up freeing more than 2 GB of space (most of it duplicate copies of ISO images and photos).

I used the following software to clean my system:

  • FSlint
  • BleachBit

Both of them delete files but are designed for different use cases, so let's look at them in a bit more detail.

FSlint

FSlint is designed to remove lint from your system; that lint can be duplicate files, broken links, empty directories and other cruft that accumulates when a system is in constant use. Installing it is quite easy; on Debian you just need to run the following command as root:

apt-get install fslint

Once the software is installed, you can use either the GUI interface or the command line. I used the GUI version because it was easier to visualize the data in graphical form (yes, I did say that; I am not anti-GUI, I just prefer the CLI for most tasks). Using the software was as easy as selecting the path to search and clicking Find. After the scan completes you get a list of all duplicates along with their paths, and you can choose to ignore them, delete all copies, or delete all except one. You need to be a bit careful when deleting, because some files need to exist in more than one location. One example is DLL files installed under Wine: I found multiple copies of the same DLL under different directories, and I would have really messed up my install if I had blindly deleted all the duplicates.
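The core idea behind duplicate finders like FSlint can be sketched in a few lines of Python: group files by size first (different sizes can never be duplicates), then confirm matches with a content hash. FSlint's real implementation differs; this is purely illustrative:

```python
# Sketch of duplicate detection: size grouping as a cheap pre-filter,
# then a SHA-256 digest of the contents to confirm real duplicates.
import hashlib
import os
from collections import defaultdict

def find_duplicates(root):
    """Return groups of paths under root whose contents are identical."""
    by_size = defaultdict(list)
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            by_size[os.path.getsize(path)].append(path)
    by_hash = defaultdict(list)
    for paths in by_size.values():
        if len(paths) < 2:
            continue  # a unique size cannot have a duplicate
        for path in paths:
            with open(path, "rb") as fh:
                digest = hashlib.sha256(fh.read()).hexdigest()
            by_hash[digest].append(path)
    return [group for group in by_hash.values() if len(group) > 1]
```

As with FSlint itself, review each group before deleting anything; identical contents in two places (like those Wine DLLs) may both be needed.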

Flossmanuals.net has a nice FSlint manual that explains all the other options. Check it out if you want to use some of the advanced features. Just make sure you have a good backup before you start deleting files, and don't blame me if you mess up your system without a working backup.

BleachBit

BleachBit is designed for the privacy-conscious user and lets you get rid of caches, cookies, Internet history, temporary files, logs etc. in a quick and easy way. You also have the option of ensuring that deleted data is really gone by overwriting the files with random data. Obviously this takes time, but it is very useful when you need guaranteed deletion. BleachBit works on both Windows and Linux and is quite easy to install and use (at least on Linux; I didn't try it on Windows). The command to install it on Debian is:

apt-get install bleachbit

Usage is also very simple: you run the software and tick the boxes for the clutter you want gone, and BleachBit will delete it. It gives you a preview of the files it found, so you can decide whether you actually want to delete what it identifies before committing.

Well this is all for now. Will write more later.

Thanks to How to Sort and Remove Duplicate Photos in Linux for pointing me towards FSlint and Ten Linux freeware apps to feed your penguin for pointing me towards BleachBit.

– Suramya

Older Posts »

Powered by WordPress