Suramya's Blog : Welcome to my crazy life…

January 11, 2021

Do young people not care about privacy because they grew up sharing personal information?

Filed under: My Thoughts,Techie Stuff — Suramya @ 1:03 PM

I don’t agree with this statement, though there are many proponents of abolishing privacy online. Unfortunately, over the years we have been trained to hand over personal information in order to use services online, based on the premise that our data has no value and that if we have nothing to hide we should be fine sharing it. This is a fallacy.

Having privacy is essential to us as humans because it gives us a judgment-free space where we can be ourselves without worrying about what others might think. This allows us to explore unpopular ideas without worry or fear. And it is not just people with unpopular ideas who need privacy: there is a lot of stuff I wouldn’t want to share with everyone even though it is nothing illegal or controversial, such as details about my health or personal finances. Privacy doesn’t mean that we don’t want to share information; it means that we get to choose who has access to data about us.

In certain scenarios privacy also protects us physically from stalkers or people who mean us harm; think of fanatical fans or people fleeing an abusive relationship. Such people would not want their physical location broadcast to the world. There was a case a few months ago where a fan used a photo published by a star to locate her apartment and assaulted her there. This breach of the star’s privacy caused her major distress. Now imagine if we could immediately find out where anyone in the world is at a given time; that would let us infer so much other data about that person, such as health information (if someone is visiting a cardiologist every week, there is a high probability they are suffering from a heart problem). A few years ago Uber analyzed data from the rides people were taking using its service and used that to figure out who was having an affair with whom, based on the rides they took and where they were dropped off. It was quite a scandal when it came out; now imagine someone taking this kind of information and blackmailing people.

With the amount of information we are giving to websites and companies, both voluntarily and involuntarily, privacy is becoming harder to maintain, but that doesn’t mean we should give up and let companies do whatever they want with our data. If we do, we should be prepared to have every aspect of our lives dissected and analyzed for profit.

Too many people state that they have nothing to hide and have no problems with having their information public. I challenge them to stay in a house completely made of glass (including the bathrooms) and have a bot that publishes all their emails/messages/call transcripts publicly. I can bet there won’t be any takers, as everyone has something they wouldn’t want to be public knowledge.

– Suramya

January 10, 2021

What are the ethical obligations of a Computer professional?

Filed under: My Thoughts — Suramya @ 11:58 PM

This is a question that is getting a lot of attention right now. A lot of people say that technologists shouldn’t be political or worry about how their tech is being used, but I believe that is wrong. This school of thought comes from a time when the people working on computers were not impacting real-world events: if a computer crashed or was hacked, it usually didn’t have a life & death impact. Now, with everything connected to everything else, including devices that affect the physical world, that is no longer the case. If there is a major flaw in the control system of a car that allows it to be hacked, it can be used to crash the car or stop it in the middle of the road, causing a pileup. If a vulnerability is found in a pacemaker, it can be used to kill people.

Because of this, all our work needs to take such scenarios into account. We can’t just create a system that causes extensive harm and hide behind orders; “this is what I was told to do” is not a valid justification for building something that is used to harm people or communities. Some claim that our job is to help our companies & clients make money and to leave the ethics at home, but that is not the correct way to look at things.

To take an example, what if I develop hacks that allow governments to spy on terrorists undetected by monitoring their phones & computers? That seems like a win-win for all, correct? I am helping stop terrorists and keeping the world safe. What else do we need? Now what if those same hacks were repurposed by repressive regimes to spy on their dissidents? It shows that everything we work on has consequences, some of which are intended & some are not.

In another example, if I figure out a way to remotely identify anyone even when they are masked, then before I release the software I also need to think about how else it might be used. Will it be used to target protestors or political dissidents? We need to figure out what other uses it might have and then take a call.

Unfortunately there is no clear answer or checklist that we can follow to make the correct decision. At the end of the day we need to make a decision and then live with the consequences.

– Suramya

January 9, 2021

Online Afterlives: Chatting with the dead

Filed under: My Thoughts — Suramya @ 2:56 PM

Dealing with death is something that everyone struggles with, and with the digital aspects of life becoming more and more prevalent there are many ways folks try to keep the memory of their loved ones accessible. There are options on major social media sites to memorialize an account after the owner passes away; others use personal websites to memorialize their loved ones. With advances in technology there is now a new way to remember them: using AI and machine learning, some companies allow you to ‘chat’ with your loved ones even after they have died. Basically these sites train a machine learning model on existing communications (emails, chats, postings, etc.) to give you the impression that you are chatting with the deceased.

In theory this is very interesting, and I like the use of technology to ease the sense of loss from a death. However, my concern is that this can quickly become a crutch, and for people who are having a hard time letting go it can make things even more complicated. In the end this is a chat-bot pretending to be a person, although to be fair the bot is explicit in telling people that it is a bot. For example, when asked where it was, the bot responded: “As a bot I suppose I exist somewhere on a computer server in San Francisco. And also, I suppose, in the minds of people who chat with me.”

Overall I am not sure how I feel about the tech. It is both good and creepy at the same time. Extrapolating into the future, we can see that it will soon be possible to create a virtual reality (VR) representation of a person that can interact with people after the original person dies. All you would need is data and enough processing power to create a model of how the person behaves. There was a TV show I saw a while ago where dead people lived on as online avatars and still interacted with their loved ones; I can’t remember the name, but it was an interesting concept.

What do you think?

Source: Popsci.com: Old text messages are letting people chat with the dead

– Suramya

January 8, 2021

Idiot threatens to kill co-worker because his friend request wasn’t accepted

Filed under: My Thoughts — Suramya @ 4:52 PM

A while ago I had posted about the most bizarre reaction I had seen from someone whose friend request I had not responded to quickly (I don’t check FB very often). I am happy to say that it was nothing like the reaction this person from North Dakota, US had. Apparently 29-year-old Caleb Burczyk decided that being friends with his ex-coworker was so important that he threatened to kill him if his request wasn’t accepted. To top things off, this moron actually went over to the ex-coworker’s house and kicked in the front door, getting caught on camera doing so, all because he couldn’t take the fact that his friend request was rejected, thereby ensuring that he now has felony charges on his record.

Caleb Burczyk, 29, pleaded not guilty to felony charges of burglary and terrorizing filed in Williams County District Court Tuesday, Dec. 29. Burczyk’s attorney Jeff Nehring declined to comment on the case.

Police say Burczyk started sending aggressive Facebook messages to his ex-coworker on Dec. 24, according to an affidavit of probable cause. He threatened his ex-coworker’s life and warned him that he was going to “come at” him if he did not accept his Facebook friend request, the affidavit stated.

“Accept my friend request or I’m going to murder you,” Burczyk wrote in a message to his ex-coworker, according to the affidavit.

I can understand being slightly upset when someone doesn’t want to be friends with you, but this is extreme; the guy should be in jail, as he is not stable and could have caused a lot of harm. These are the kind of entitled morons who need to be taught that the world doesn’t revolve around them and that people are allowed to not associate with them. I don’t blame the co-worker for not wanting to be friends if this is how the guy behaves; I am sure he was acting the same way in real life as well.

Source: PSA: If Someone Doesn’t Accept Your Friend Request, Do Not Threaten To Kill Them And Kick In Their Front Door

– Suramya

January 7, 2021

Welcoming 2021 in style at Rajakkad Estate, Dindigul

Filed under: My Life,Travel/Trips — Suramya @ 5:06 AM

One of the biggest things that I missed in 2020 was traveling. Usually we travel to multiple places over the year, but due to Covid we barely traveled anywhere in 2020. So we wanted to do a trip for New Year’s because we were getting stir-crazy, and we wanted to celebrate with a small group of close friends without having to interact with unknown folks. After a bit of research we decided to head down to Rajakkad Estate, Dindigul. This is an 18th century palace that was transported from Kerala by breaking it down into 35,000 pieces and re-assembling it in Dindigul in the middle of an 80-acre estate. The reviews of the place were amazing, so we booked it and prepared for the journey. Due to Covid we had to register for an e-pass, as we were crossing into Tamil Nadu, and the pass was auto-approved. Interestingly, no one checked our e-pass, but I don’t recommend traveling without one, because if you are checked and don’t have the pass you are sent back.

The trip started early morning (5:30am) on the 31st with me, Jani, Ayush & Akanksha in my car and Shashank on his bike. Just as we left home it started raining, and it continued to rain throughout the trip. Because of this we had to drive at a slower speed, and we also ended up stopping a few times so that Shashank could take a break from riding in the heavy rain. Thus the trip that was estimated to take ~7 hours took us over 9.5 hours to complete. The last part of the drive was amazing, with spectacular views, but I didn’t get to enjoy them much as I had to focus on the road; I didn’t want to drive off the cliff while admiring the view. 😉

We finally reached the estate around 3pm and found that the five of us were the only guests there for the duration, so we got to select our rooms. Jani and I selected a corner room with spectacular views of the forest. The host Robesh walked us through the place, and once we settled in we immediately requested lunch, as breakfast had been a long time ago and this time we couldn’t stop for snacks as we usually do, due to the rains. Lunch was quite good, and every item on the menu was locally sourced from the estate’s own farm. I can’t comment on the non-veg dishes, but the veg options were quite tasty and healthy. I do recommend that you stick with the south Indian food options (they do make continental food but we didn’t try any) as the cooks are not that great with north Indian dishes (especially chapatis). Other than that, most of the dishes were quite good. I even enjoyed the banana flower vegetable, which is something I have never liked before.


Hogging on great food

After lunch we thought about exploring the surrounding area a bit, but it was still raining so we just relaxed in the sitting area and played a bunch of board games (they have a good collection). It had been a while since we all got together, so it was good to catch up on each other’s lives. Thanks to the rain the temperature dropped quite significantly, and Jani was quite thankful for her electric heated jacket. We asked if we could get a heater set up in the area, but apparently there was no power outlet (15A) that could take the load of the heater, so after braving the cold for a bit we moved the gathering to the room, where it was a lot warmer once we closed the windows and the door. (If you are someone who doesn’t like the cold, make sure you pack warm clothes, as the place is on a hill and it gets quite cold in the evening.)


The Central courtyard of the palace

We had initially planned to ring in 2021 with a bonfire, but thanks to the rain we thought it wouldn’t be possible. However, the staff went out of their way to get a bonfire started for us, and we got to sit outside next to the fire to welcome 2021. It was completely unexpected and a very pleasant surprise. After enjoying the fire for a bit we moved back to the room due to the cold and spent another couple of hours just chilling. We couldn’t stay up too late because we were all tired from the early start, so we crashed.


Welcoming the New Year with a bonfire


Cheers to having a fantastic 2021

The next day, I woke up at 8am for some reason and then couldn’t go back to sleep, so I spent a very pleasant morning walking around the surrounding woods and exploring the lovely garden. Once everyone else woke up we had a great breakfast with fresh juice, homemade bread, south Indian dishes and eggs. Thankfully it had stopped raining and the weather was very pleasant, so we decided to explore the surrounding area and walked over to the yoga platform, which is built into the side of the hill with a spectacular view of the hills and the sunrise (if you wake up early enough). We didn’t wake up that early, so we just sat there for a bit enjoying the view and listening to Jani talk about all the flowers that she could identify (which were a lot). We then went for an hour-long walk in the forest and saw a whole bunch of flora & fauna. The walk helped us build up an appetite, and we were ready to do justice to the lunch prepared for us. It was served in an outdoor seating area, and we all really enjoyed the food and the location.


Jani and me at the front garden


Freshly plucked tamarind directly from the tree


Group Selfie at the estate


Chilling at the Yoga platform

Post lunch we relaxed for a bit, and Robesh suggested we check out a waterfall nearby, so we all drove for about 15 mins to the trailhead and then walked down to the falls. It was a nice walk, with a rope bridge on the way that we had to cross. It was amusing to watch some of the folks crossing the bridge (I am not going to name names) as they were quite scared. We did see a couple of giant squirrels on the way, along with a whole bunch of birds, which was quite nice. Once we got back we remembered that we had brought badminton racquets, so we played for a bit till it became too dark to see, post which we had dinner, another round of games and great conversation, and relaxed. There is not a lot to do at this place, so be prepared to entertain yourselves. The phone and 4G signal were quite spotty on the property, and the WiFi was down thanks to the rain, so we got to spend time without the constant distraction of the online world.


Enjoying the waterfall view

The next day all of us were up early, as we had planned to leave immediately after breakfast so that we could reach home before dark. Breakfast was served in a fantastic open-air seating area, and we enjoyed great food with the sounds of nature as background music. Post breakfast we finished our packing and started back to Bangalore, relaxed and rejuvenated.

However, the trip had some additional surprises in store for us. After about an hour of driving I realized that my car’s AC fan had stopped working, and the temperature soon became too hot for me to handle (though Jani loved it). We ended up driving with the windows down, which was fine on the highway, but every time we had to slow down for tolls or traffic I was quite miserable. The return journey took us almost 11 hours due to the more frequent stops and traffic.


Breakfast in the forest

By the time we reached home I was exhausted and ended up crashing immediately. We drove ~850 km round trip and it was worth every minute, as the trip was a lot of fun with good food & stay, great company and a fantastic way to welcome 2021!

Wishing you all a Very Happy New Year!

Will write more later.

– Suramya

November 28, 2020

My Backup strategy and how it has evolved over the years

I am a firm believer in backing up my data. Some people say that I am paranoid about backing up data, and I do not dispute it. All my data is backed up on multiple drives and locations, and I still feel that I need additional backups. This is because I read the news, and there have been multiple cases where people lost their data because they hadn’t backed it up. Initially I wasn’t that serious about it, but when I was in college and working at the helpdesk, a PhD student came in crying because her entire thesis was on a Zip drive that wasn’t working anymore. She didn’t have a backup and was basically screwed. We tried a bunch of stuff to recover the data but didn’t manage to recover anything. That made me realize that I needed a better backup procedure, and so started my journey in creating recoverable backups.

My first backup system was a partition on my drive called ‘backup’ where I kept a copy of all my important data (this was back in 2000/2001). Then I realized that if the drive died I would lose access to the backup partition as well, so I started looking for alternatives. This was around the time I bought a CD writer, so all my important data was backed up to CDs and I was confident that I could recover any lost data. Shortly afterwards I moved to DVDs for easier storage. However, I didn’t realize till a lot later that CDs & DVDs can start becoming unreadable quite quickly. Thankfully I didn’t lose any data, but it was a rude awakening to find that the disks I had expected to keep my data safe were starting to become unreadable within a few years.

I then did a bunch of research online and found that the best medium for storing data long term is still hard drives. I didn’t want to store anything online because I want my data to be in my control, so any online backup system was out of the question. I added multiple drives to my desktop and started syncing the data from the desktop & laptop to the backup drive using rsync. This ensured that the important data was in three locations at any given time: my desktop, my laptop and the backup drive (plus a DVD copy that I made of all my data every year).
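For anyone not familiar with rsync, a minimal sketch of the kind of sync command involved (the paths here are illustrative, not my actual layout):

# Mirror the Documents folder to the backup drive; -a preserves permissions
# and timestamps while recursing, --delete removes files from the backup
# that no longer exist in the source so the two stay in sync
rsync -av --delete /home/suramya/Documents/ /mnt/Backup/Documents/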

I continued with this backup strategy for a few years but then realized that I had no way to go back to a previous version of any given document: if I deleted a file or wanted an older version of it, I only had 24 hours before the changes were synced to the backup drive and became unrecoverable. There was a case where I ended up having to dig through my DVD backups to find the original version of a file that I had changed. So I did a bit of research and found rdiff-backup. It allows a user to back up one directory to another and generates incremental backups, so you can recover/restore files based on a date range. The best part is that the software is highly efficient: once the initial backup is done, it only transmits the changes to the files in subsequent runs. Now that I have been using it for a while, I can restore a snapshot of my data going back to 2012 quite easily.
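To give an idea of what this looks like in practice, here is a minimal sketch (the paths and dates are illustrative):

# Regular backup run; rdiff-backup keeps a current mirror plus reverse increments
rdiff-backup /mnt/data/Documents/ /mnt/Backup/Snapshots/Documents/

# Restore a file as it existed 30 days ago using --restore-as-of (-r);
# it accepts intervals like 30D as well as absolute dates like 2012-01-01
rdiff-backup -r 30D /mnt/Backup/Snapshots/Documents/notes.txt /tmp/notes_old.txt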

I was quite happy with this setup for a while, but while reading an article on backup best practices I realized that I was still depending on only one location for the backup data (the rdiff-backup snapshots), and the best practices stated that you should also store backups on an external drive or at an offsite location to prevent viruses/ransomware from deleting them. So I bought a 5TB external drive and created an encrypted partition on it to store all my important data. But I was still unhappy, because all of this was still stored at my home: if I had a fire or something, I would still end up losing the data even though my external drive was kept in a safe. I still didn’t want to store data online, but that was still the best way to ensure I had an offsite backup. I initially thought about setting up a server at my parents’ place in Delhi and backing up there, but that didn’t work out for various reasons. Plus I didn’t want to have to call them and troubleshoot backup issues over the phone.

Around this time I was reading about encrypted partitions and came up with the idea of creating an encrypted container file to store my data and then backing up the container file online. I followed the steps I outlined in my post How to encrypt your Hard-drive in Linux and created the encrypted container. Once I finished that, I had to upload the container to my webhost, since I had unlimited storage space as per my contract. Initially I wasn’t able to, because they had restricted my account’s quota, but a call to their customer support sorted it out after a bit of arguing and explaining what I was doing. The next hurdle was uploading the file to the server, because of the ridiculously low upload speed I was getting from Airtel. I had a 40 Mbps connection at the time, but the upload speed was restricted to 1 Mbps because of ‘reasons’. After arguing with their support for a while, I was complaining about it at work and one of the folks suggested I check out ACT Internet. I checked out their plans and was quite impressed with the offerings, so I switched over to ACT and was able to upload the container file quickly and painlessly.
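Going back a step, the container creation itself boils down to something like this condensed sketch (the size and filenames are illustrative; see the linked post for the full walkthrough):

# Create a 100GB sparse file to hold the encrypted volume
dd if=/dev/zero of=Enc_vol1.img bs=1 count=0 seek=100G

# Format it as a LUKS volume, open it and create a filesystem inside
# (recent cryptsetup versions attach a loop device automatically for file images)
cryptsetup luksFormat Enc_vol1.img
cryptsetup luksOpen Enc_vol1.img enc
mkfs.ext4 /dev/mapper/enc
cryptsetup luksClose enc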

Once the container was uploaded, I had to tackle the next problem in the process, which was how to update the files in the container without having to upload the entire container to the host again. I experimented with a few solutions and then came up with the following:

1. Mount the remote partition as a local mount using sshfs. I mounted the partition locally using the following command: (please replace with the correct hostname and username before using)

/usr/sbin/runuser -l suramya -c "sshfs -o allow_other username@hostname.com:. /mnt/offsite/"

2. Once the remote partition was mounted locally, I was able to use the usual commands to mount the encrypted partition to another location using the following command:

/usr/sbin/cryptsetup luksOpen /mnt/offsite/container/Enc_vol1.img enc --key-file /root/UserKey.dat
mount /dev/mapper/enc /mnt/stash/

In an earlier iteration of the setup I wasn’t using a keyfile, so I had to manually enter the password every time I wanted to back up to the offsite location. This meant that the backup was done randomly, as and when I remembered to run the command manually. A few days ago I finally configured it to run automatically after adding the keyfile as a decryption key. (Obviously the keyfile should be protected and not be accessible to others, because it allows anyone holding it to decrypt the data without entering a password.) Now the offsite backup runs once a week while the local backup runs daily, and I still back up the Backup partition to the external drive manually as and when I remember to do so.
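For completeness, here is roughly what the keyfile and scheduling setup looks like (the script names and timings below are illustrative, not my exact configuration):

# Add the keyfile as an additional LUKS key so the script can run unattended
cryptsetup luksAddKey /mnt/offsite/container/Enc_vol1.img /root/UserKey.dat
chmod 600 /root/UserKey.dat

# Example crontab entries: local backup daily at 2am, offsite weekly on Sunday
0 2 * * * /root/scripts/local_backup.sh
0 3 * * 0 /root/scripts/offsite_backup.sh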

All in all I was quite happy with my setup, but then while updating the encrypted container a network issue made me believe that my remote container had become corrupted (it hadn’t, but I thought it had). At the same time I was fooling around with Microsoft OneDrive and saw that I had 1TB of storage available there, since I was an Office 365 subscriber. This gave me the idea of backing up the container to OneDrive in addition to my site hosting.

I first tried copying the entire container to the drive and hit a limit because the file was too large. So I thought I would split the file into 5GB parts and then sync them to OneDrive using rclone. After installing rclone, I configured it to connect to OneDrive by issuing the following command and following the on-screen prompts:

rclone config

I then created a folder on OneDrive called container to store the split files and then tried uploading a test file using the command:

rclone copy $file OneDrive:container

Here, OneDrive is the name of the remote I configured in the previous step. This was successful, so I just needed to create a script that did the following:

1. Update the Container file with the latest backup
2. Split the Container file into 5GB pieces using the following command:

split --verbose -d -b5GB /mnt/repository/Container/Enc_vol1.img /mnt/repository/Container/Enc_vol_

3. Upload the pieces to OneDrive:

for file in /mnt/repository/Container/Enc_vol_*; do echo "$file"; /usr/bin/rclone copy "$file" OneDrive:container -v &>> /tmp/oneDriveSync.log; done

This loop uploads the pieces to the drive one at a time and is a bit slow because the upload speed maxes out at ~2 Mbps. If you split the uploads and run the commands in parallel, you get much faster speeds; see the sketch below. Keep in mind that if you upload more than 10 files at a time you will start getting errors about too many open connections, and then you have to wait for a few hours before you can upload again. It took a while to upload the chunks, but now my files are stored in yet another location, and the system is configured to sync to OneDrive once a month.
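If you want to try the parallel route, here is a hedged sketch using xargs (the job count of 4 is an assumption; keep it below the 10-connection limit mentioned above):

# Upload up to 4 chunks at a time instead of one by one
ls /mnt/repository/Container/Enc_vol_* | xargs -P 4 -I{} /usr/bin/rclone copy {} OneDrive:container -v

# To restore, download the chunks and stitch them back together
cat /mnt/restore/Enc_vol_* > /mnt/restore/Enc_vol1.img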

So, as of now my files are backed up as following:

  • /mnt/Backup: Local Drive. All changes are backed up daily using rdiff-backup
  • /mnt/offsite: Encrypted Container stored online. All changes are backed up weekly using rsync
  • OneDrive: Encrypted Container stored at Microsoft OneDrive. All changes are backed up monthly using rsync
  • External Drive: Encrypted backup stored on an external hard drive using rsync. Changes are backed up manually at irregular intervals.
  • Laptop: All Important files are copied over to the laptop using Unison/rsync manually so that I can access my data while traveling

Finally, I am also considering backing up the snapshot data to Blu-ray disks, but that will take time so I haven’t gotten around to it yet.

Since I have this elaborate backup procedure, I wasn’t worried much when one of my disks died last week, and I was able to continue working without issues or worries about losing data. I still think I can enhance the backups I take, but for now I am good. If you are interested in my backup script, an extract of the code is listed below:

function check_failure ()
{
	if [ $? -eq 0 ]; then
		logger "INFO: $1 Succeeded"
	else
		logger "FATAL: Execution of $1 failed"
		wall "FATAL: Execution of $1 failed"
		exit 1
	fi
}

###
# Syncing to internal Backup Drive
###

function local_backup ()
{
	export BACKUP_ROOT=/mnt/Backup/Snapshots
	export PARENT_ROOT=/mnt/repository

	logger "INFO: Starting System Backup"

	rdiff-backup -v 5 /mnt/data/Documents/ $BACKUP_ROOT/Documents/
	check_failure "Backing up Documents"

	rdiff-backup -v 5 /mnt/repository/Documents/Jani/ $BACKUP_ROOT/Jani_Documents/
	check_failure "Backing up Jani Documents"

	rdiff-backup -v 5 $PARENT_ROOT/Programs/ $BACKUP_ROOT/Programs/
	check_failure "Backing up Programs"

	..
	..

	logger "INFO: All Backups Completed Successfully."
}

### 
# Syncing to Off-Site Backup location
###

function offsite_backup ()
{
	export PARENT_ROOT=/mnt/repository

	# First we mount the remote directory to local
	logger "INFO: Mounting External Drive"
	/usr/sbin/runuser -l suramya -c "sshfs -o allow_other username@remotehost:. /mnt/offsite/"
	check_failure "Mounting External Drive"

	# Open the Encrypted Partition
	logger "INFO: Opening Encrypted Partition. Please provide password."
	/usr/sbin/cryptsetup luksOpen /mnt/offsite/container/Enc_vol1.img enc --key-file /root/keyfile1
	check_failure "Mounting Encrypted Partition Part 1"

	# Mount the device
	logger "INFO: Mounting the drive"
	mount /dev/mapper/enc /mnt/stash/
	check_failure "Mounting Encrypted Partition Part 2"

	logger "INFO: Starting System Backup"
	rsync -avz --delete  /mnt/data/Documents /mnt/stash/
	check_failure "Backing up Documents offsite"
	rsync -avz --delete /mnt/repository/Documents/Jani/ /mnt/stash/Jani_Documents/
	check_failure "Backing up Jani Documents offsite"
	..
	..
	..

	umount /mnt/stash/
	/usr/sbin/cryptsetup luksClose enc
	umount /mnt/offsite/

	logger "INFO: Offsite Backup Completed"
}

This is how I make sure my data is backed up. All of Jani’s data is also backed up to my system using robocopy, as she is running Windows, and then it gets backed up by the scripts I explained above as usual. I also have scripts to back up my website/blog/databases, but that’s done using a simple script. Let me know if you are interested and I will share them as well.
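For reference, the robocopy invocation is along these lines (the paths are made up for the example; /MIR mirrors the source tree, /Z makes the copy restartable and /LOG writes a log file):

robocopy "C:\Users\Jani\Documents" "\\desktop\Backup\Jani_Documents" /MIR /Z /LOG:C:\Temp\robocopy.log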

This is all for now. Let me know if you have any questions about the backup strategy or if you want to make fun of me. 🙂 Will write more later.

– Suramya

October 16, 2020

Response to a post that insists that you should ‘Focus on your Job not side projects’

Filed under: My Thoughts,Techie Stuff — Suramya @ 11:44 AM

I found this post while surfing the web. The main point of the post is to tell people that they should stop focusing on their side projects, because recruiters are not interested in them, and that what matters in getting a job is the name of your current company. He also recommends dropping side projects and reading “Cracking the Coding Interview” instead to learn everything you need to know about algorithms and binary trees so that you get a job. There are so many things in the post that I disagree with that it was hard for me to figure out where to start.

Let me start off by saying that having a cool portfolio will not necessarily get you a job, as there is an element of luck involved. You do need to know how to crack an interview, so do read through Cracking the Coding Interview, guides on how to interview, etc. I will not go through a list of dos and don’ts for interviews here, as that is not the purpose of this post, but basically you need to show that you are competent in the skill set they are looking for and that you are not a problem person to work with (basically, you need to leave your ego at home). That being said, there are enough candidates in the market looking for a job that you need something to differentiate you from the rest of the crowd. That’s where your side projects come in.

I am going to quote some of the more problematic portions of the post here and then respond, to make it easier for people to follow my reasoning. So let’s dig in.

First, most recruiters don’t care about your personal projects or how many meetups you went during the year. What matters the most is your current company – and by that I mean the name of your current company. It was saddening me before, but now that I’m on the other side today, with a manager position, I better understand this. This is plain common sense. You can generally assume that a developer coming from a cutting-edge company has better chances to be a great developer than a developer coming from a Java 1.4 shop. He may not be smarter, but he has been hired by a company with a most demanding hiring process, and has been surrounded by some of the smartest developers.

I completely disagree with this. (I am using ‘recruiters’ to mean tech recruiters, who are basically headhunters for a firm, not the people who will be working with you.) Recruiters are not there to talk to you about your personal projects; they are there to assess your fit for the skillset that the sourcing company is asking for. If you are a match for the skills, they will move you to the next level, where you interview with the hiring manager or go through a technical interview. If you are not a fit, then it doesn’t matter if you have a million side projects; they will not proceed with the interview. One way side projects do help in such a scenario is by letting you prove you have skills in a particular domain even though you haven’t worked in it in a professional capacity.

Coming to the second point, using the current company as a hiring criterion is one of the most idiotic things I can think of for screening people. I have worked at Goldman Sachs, Sprint & Societe Generale, and as everywhere, there were some employees in each company who made you think “How on earth did they get hired here?”, and this is after a seriously demanding set of interviews to join the firm (I had 9 interviews for Goldman). Just because someone works at a company doesn’t mean they are the best fit for your requirement. Secondly, no company is uniform, so it is guaranteed that there will be parts of the company working with the cutting edge while other teams are on antique systems. In one of my previous companies (not going to name them here 🙂 ) there was a team using Git & the latest software stack for building their releases and another team that used RCS and tooling around it to build their software.

Assuming that the entire company is on the same stack is a mistake, especially when talking about large companies. In small to medium companies this might not always be the case, but even there it is possible that there is a legacy system that hasn’t been changed or upgraded and people are working on it. Forget the latest systems: a lot of the major banks still have mainframes running critical portions of their software while other parts of the bank use AI/ML for their projects.

Yes, there is a certain quality that is assumed when interviewing a person from a famous company, but it is not what I am basing my hiring on; you will be hired on your skills, not your past job experience. Basically, in my opinion your past jobs can get you in the door for the interview, but passing it is up to your skills & attitude. You should try to use side projects as a way to showcase your skills. E.g., if you created a super cool way of doing X with a new technology, it will do more to showcase your skill than stating that you wrote code from 9 to 5.

Worse, having too many personal projects can raise a flag and be scary for the recruiter.

I have never had this happen, and I was the guy with a ridiculous number of side projects throughout my career. Most of the skills I have come from trying out new technology at home, and since just reading a book doesn’t make you proficient, I would end up using the tech in my next project, giving me hands-on experience with it. In fact I have found my side projects to be a great benefit when interviewing, because most technical interviewers are techies themselves and it can be fun to discuss such projects with them. I remember one particular interview where I mentioned one of my side projects (an email-to-SMS bridge) and we then spent about 20 mins talking about its applications and how it could be improved. It played a big part in why I was hired for the role.

If a company is scared that you are working on stuff outside its work areas, then I don’t think it is a company you would want to work with in any case. At least I wouldn’t want to work for such a company.

My CTO experience was an anomaly, at best two lost years, at worst a sign that I was too independent, too individualistic, not a good team player. Only relatively small and ambitious startups, like the one I’m in today, were valuing this experience.

Again I must disagree. When you work in a startup you learn a lot and get to explore areas outside of what you are officially supposed to be doing. This is a great benefit when you later work at the usual big companies, because you now know how the other parts of the software/hardware stack work and can use that to identify issues before they become problems.

However, one point I do want to stress is that if you started a company right out of college and became its CTO, it will not be given as much weightage as if you had done it after a bit of industry experience. I worked with a startup at my previous company where the entire team’s combined work experience was less than mine, and it was quite apparent in how they worked. For example, they were very casual about releases: if they managed to finish an extra feature before the release, they would go ahead and ship it without notifying us, even though it wasn’t tested. But the drive they brought to the project was something else. I was blown away by their push to ensure that their software did everything we asked of it.

The best way to dig a new technology is to practice it in your daily job. You’ll spend seven hours a day on it and will quickly become infinitely more proficient than if you just barely used it on nights and weekends. You may tell me that we face a chicken or egg problem here. How to get a job where you’ll work on a really attractive technology if you never used it before? Well, instead of spending nights superficially learning this technology, spend your nights preparing interviews. Read “Cracking the code interview”, learn everything you need to know about algorithms and binary trees. As we all know, the interview process is broken. Instead of deploring it, take advantage of it.

Unless you are very lucky, you will hardly ever be working on cutting-edge tech at your day job. Companies don’t want to experiment with new untested technologies for their production systems; they want something rock solid. If you are lucky, you will get a few hours a week to try out a new tech to evaluate it, and then it takes a few months/years before they put it in production (depending on the company).

In summary, I would like to say that side projects can be a big benefit while searching for a job, but you also need to ensure you don’t neglect the other parts of your profile, like communication skills, leadership skills, teamwork, etc. If you have a very strong skillset and you are using side projects to expand your skills, then you should be good for most companies.

Well this is all for now. Will write more later.

– Suramya

October 15, 2020

Spinach can power up fuel cells in addition to Popeye

Filed under: Emerging Tech — Suramya @ 11:43 PM

A lot of us grew up watching Popeye get a power boost from eating spinach; now, thanks to research done at American University, we know that spinach can also be used to give fuel cells a boost. Historically we have used platinum-based catalysts in fuel cells, but since platinum is very expensive & hard to obtain, teams have been looking for alternatives. The researchers found that, due to the high iron & nitrogen content of spinach, they were able to create a viable catalyst from it.

To prepare the catalyst, you wash the leaves, pulverize them into a juice and then freeze-dry the result. The freeze-dried juice is ground into a powder, and melamine and salts like sodium chloride & potassium chloride are added. The composite is then pyrolyzed at 900°C a couple of times, resulting in the catalyst. The results so far have been quite promising, but a lot more research is needed to see if this is viable at a commercial scale. The biggest advantage of using spinach is that it is a renewable & sustainable source of biomass.

Biomass-derived porous carbon materials are effective electrocatalysts for oxygen reduction reaction (ORR), with promising applications in low-temperature fuel cells and metal–air batteries. Herein, we developed a synthesis procedure that used spinach as a source of carbon, iron, and nitrogen for preparing porous carbon nanosheets and studied their ORR catalytic performance. These carbon sheets showed a very high ORR activity with a half-wave potential of +0.88 V in 0.1 M KOH, which is 20 mV more positive than that of commercial Pt/C catalysts. In addition, they showed a much better long-term stability than Pt/C and were insensitive to methanol. The remarkable ORR performance was attributed to the accessible high-density active sites that are primarily from Fe–Nx moieties. This work paves the way toward the use of metal-enriching plants as a source for preparing porous carbon materials for electrochemical energy conversion and storage applications.

The next step in the process is to create a fuel cell using this catalyst and the team is exploring collaboration options with other research groups.

Source: Spinach Gives Fuel Cells a Power Up

– Suramya

October 14, 2020

Walking around in a Cell using Virtual Reality

Filed under: Computer Hardware,Emerging Tech,Techie Stuff — Suramya @ 11:59 PM

It’s hard to view 3D data efficiently on a 2D screen, which is why Virtual Reality (VR) & Augmented Reality (AR) have so many fans: they allow us to interact with data in 3D, making it more intuitive and easier to process (for some use cases). Now there is another application for VR that actually makes sense and is not just hype. Researchers at the University of Cambridge & Lume VR Ltd have managed to convert super-high-resolution microscopy data into a format that can be visualized in VR.

Till 2014 it was assumed that we could never obtain a resolution better than half the wavelength of light. The 2014 Nobel Laureates in Chemistry managed to work around this limitation, creating a new field called super-resolution microscopy that allows us to obtain images at the nanoscale. This enables us to see the individual molecules inside cells, to track proteins involved in various diseases or to watch fertilized eggs as they divide into embryos. Combining this with the technology from Lume VR allows us to visualize and interact with the biological data in real time.

Walking through a cell gives you a different perspective, and since the data is near real time it allows us to literally watch a cell’s reaction to a particular stimulus. This will have massive implications for the biomed/biotech fields. Maybe we can use it to figure out why organ rejections happen or what causes Alzheimer’s.

“Data generated from super-resolution microscopy is extremely complex,” said Kitching. “For scientists, running analysis on this data can be very time-consuming. With vLUME, we have managed to vastly reduce that wait time allowing for more rapid testing and analysis.”

The team is mostly using vLUME with biological datasets, such as neurons, immune cells or cancer cells. For example, Lee’s group has been studying how antigen cells trigger an immune response in the body. “Through segmenting and viewing the data in vLUME, we’ve quickly been able to rule out certain hypotheses and propose new ones,” said Lee. “This software allows researchers to explore, analyse, segment and share their data in new ways. All you need is a VR headset.”

Interestingly, vLUME is available for download as an open source program from their Git repository. The program is free for academic use. Check it out if you are interested in how it works.

Source: New virtual reality software allows scientists to ‘walk’ inside cells

– Suramya

October 13, 2020

It is now possible to generate clean hydrogen by Microwaving plastic waste

Filed under: Emerging Tech,Interesting Sites,My Thoughts — Suramya @ 2:33 PM

Plastic is a modern hazard, and plastic pollution has a massive environmental impact. As of 2018, 380 million tonnes of plastic were being produced worldwide each year (source: Wikipedia). Since we all knew that plastic was bad, a lot of effort was put into getting people to recycle plastics, and single-use plastics have been banned in a lot of places (in India they are banned as of 2019). However, as per a recent report by NPR, recycling doesn’t keep plastic out of landfills, as it is not economically viable at a large scale: it is simply cheaper to bury the plastic than to clean and recycle it. Apparently this has been known for years, but the big oil companies kept it quiet to protect their cash cow. So the hunt for what to do with the plastic continues, and thanks to recent breakthroughs there just might be light at the end of this tunnel.

Apparently plastic has a high density of hydrogen in it (something that I wasn’t aware of), and it is possible to extract this hydrogen to use as fuel for a greener future. The existing methods involve heating the plastic to ~750°C to decompose it into syngas (a mixture of hydrogen and carbon monoxide), which is then separated in a second step. Unfortunately this process is energy intensive and difficult to make commercially viable.

Peter Edwards and his team at the University of Oxford decided to tackle this problem and found that if you break the plastic into small pieces with a kitchen blender, mix it with a catalyst of iron oxide and aluminium oxide, and then microwave it at 1000 watts, almost 97 percent of the gas in the plastic is released within seconds. The cherry on top is that the material left over after the process is almost exclusively carbon nanotubes, which can be used in other projects and have vast applications.

The ubiquitous challenge of plastic waste has led to the modern descriptor plastisphere to represent the human-made plastic environment and ecosystem. Here we report a straightforward rapid method for the catalytic deconstruction of various plastic feedstocks into hydrogen and high-value carbons. We use microwaves together with abundant and inexpensive iron-based catalysts as microwave susceptors to initiate the catalytic deconstruction process. The one-step process typically takes 30–90 s to transform a sample of mechanically pulverized commercial plastic into hydrogen and (predominantly) multiwalled carbon nanotubes. A high hydrogen yield of 55.6 mmol g−1 of plastic is achieved, with over 97% of the theoretical mass of hydrogen being extracted from the deconstructed plastic. The approach is demonstrated on widely used, real-world plastic waste. This proof-of-concept advance highlights the potential of plastic waste itself as a valuable energy feedstock for the production of hydrogen and high-value carbon materials.

Their research was published in Nature Catalysis (DOI: 10.1038/s41929-020-00518-5) yesterday and is still in the early stages, but if this holds up in larger-scale testing it will allow us to significantly reduce the plastic waste that ends up in landfills and at the bottom of the ocean.

Source: New Scientist: Microwaving plastic waste can generate clean hydrogen

– Suramya

