Suramya's Blog : Welcome to my crazy life…

February 13, 2018

Explaining HTTPS using carrier pigeons

Filed under: Interesting Sites,Security Tutorials,Techie Stuff — Suramya @ 7:07 PM

HTTPS is something that a lot of people find hard to explain without going into a lot of technical jargon, which frankly just confuses most people and causes them to zone out. However, it is an essential protocol, so understanding it is a good idea. To address this issue, Andrea Zanin, a student, created the following primer that explains how HTTPS works using carrier pigeons as the messengers.

Below is an explanation of how HTTP would work with carrier pigeons:

If Alice wants to send a message to Bob, she attaches the message to the carrier pigeon’s leg and sends it to Bob. Bob receives the message, reads it, and all is good.

But what if Mallory intercepted Alice’s pigeon in flight and changed the message? Bob would have no way of knowing that the message sent by Alice was modified in transit.

This is how HTTP works. Pretty scary, right? I wouldn’t send my bank credentials over HTTP and neither should you.
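As a hedged sketch of the integrity guarantee that HTTPS adds to this story: if Alice and Bob share a secret key, Bob can detect Mallory's tampering. Real HTTPS/TLS is far more involved (certificates, key exchange, encryption), but the detection idea looks roughly like this; the key and messages below are invented for illustration:

```python
import hashlib
import hmac

# Assumed pre-shared secret between Alice and Bob (illustrative only).
SECRET = b"alice-and-bob-shared-key"

def sign(message):
    """Alice attaches an HMAC tag computed over the message."""
    return hmac.new(SECRET, message, hashlib.sha256).digest()

def verify(message, tag):
    """Bob recomputes the tag; a message modified in flight will not match."""
    return hmac.compare_digest(sign(message), tag)

message = b"Send 100 coins to Bob"
tag = sign(message)
assert verify(message, tag)                           # delivered intact
assert not verify(b"Send 100 coins to Mallory", tag)  # Mallory's edit is caught
```

Without the tag (plain HTTP in the analogy), Bob has no way to tell the two messages apart.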

Check out the link for the full writeup.

Well, this is all for now. Will write more later.

– Suramya

February 7, 2018

Hacking the Brainwaves Cyber Security CTF Hackathon 2018

Earlier this year I took part in the Brainwaves Cyber Security Hackathon 2018 with Disha Agarwala, and it was a great experience. We both learnt a lot from the hackathon, and in this post I will talk about how we approached the problems and some of our takeaways from the session.

Questions we had to answer/solve in the Hackathon:

  • Find the web server’s version and the operating system on the box
  • Find out what processes are running on the server
  • What fuzzy port is the SSH server running on?
  • Discover the site architecture and layout
  • Describe the major vulnerability in the home page of the given website based on the OWASP Top 10. Portal Url:
  • Gain access to the member area and admin area through blind SQL injection or session management
  • Dump all user accounts from the member area [SQLi]
  • [Broken Validation] Demonstrate how you can modify the limit in order management
  • [Open Redirect] Redirect site/page to
  • List any other common bugs you came across while on the site
  • After logging into the member area, perform the following functions:
    • Find the master hash & crack it
    • Dump all users
    • Find the email IDs and passwords of saved users

Information Gathering:

In order to find the services running on the server, the first thing we had to do was find the IP/hostname of the actual server hosting the site, which was a bit tricky because the URL provided was protected by CloudFlare. So any scan of the URL took us to the CloudFlare proxy server instead of the actual server, which was a problem.

We figured this out by trying to access, in the browser, the IP address that the hostname translated to.

suramya@gallifrey:~$ host has address 

Since the site homepage didn’t do anything except display text that refreshed every 15 seconds, we needed to find other pages on the site to give us an attack surface. We checked whether the site had a robots.txt (a file that tells web crawlers not to index certain directories). The directories it lists are usually ones that hold sensitive data, and in this case the file existed with the following contents:

# robots.txt
User-agent: *
Disallow: images
Disallow: /common/
Disallow: /cgi-bin/
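As a side note, Python's standard library can parse a robots.txt like the one above. A minimal sketch (with two of the disallowed paths copied from the listing): robots.txt is advisory only, so "disallowed" files stay perfectly reachable to a human visitor.

```python
from urllib.robotparser import RobotFileParser

# Parse the robots.txt content found on the server (paths copied from above).
robots_lines = [
    "User-agent: *",
    "Disallow: /common/",
    "Disallow: /cgi-bin/",
]
parser = RobotFileParser()
parser.parse(robots_lines)

# Well-behaved crawlers skip these paths, but nothing stops us from visiting them.
assert not parser.can_fetch("*", "/common/embed.php")
assert parser.can_fetch("*", "/index.html")
```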

The images directory didn’t have any interesting files in it, but the /common/ directory had a file named embed.php which ran a PHP info dump. This dump contains a lot of information that can be used to attack a site, but the main item we found here was the IP address of the actual server where the services were running.

Using this information we were able to initiate an nmap scan to get the services running on the site. The nmap command that gave us all the information we needed was:

nmap -sV -O -sS -T4 -p 1-65535 -v

This gave us the following result set after a really really long run time:

23/tcp   filtered telnet
25/tcp   open     smtp?
80/tcp   open     http          This is not* a web server, look for ssh banner
81/tcp   open     http          nginx 1.4.6 (Ubuntu)
82/tcp   open     http          nginx 1.4.6 (Ubuntu)
137/tcp  filtered netbios-ns
138/tcp  filtered netbios-dgm
139/tcp  filtered netbios-ssn
445/tcp  filtered microsoft-ds
497/tcp  filtered retrospect
1024/tcp open     kdm?
1720/tcp open     h323q931?
2220/tcp open     ssh           OpenSSH 6.6.1p1 Ubuntu 2ubuntu2.8 (Ubuntu Linux; protocol 2.0)
2376/tcp open     ssl/docker?
3380/tcp open     sns-channels?
3389/tcp open     ms-wbt-server xrdp
5060/tcp filtered sip
5554/tcp filtered sgi-esphttp
8000/tcp open     http          nginx 1.4.6 (Ubuntu)
8080/tcp open     http          Jetty 9.4.z-SNAPSHOT
8086/tcp open     http          nginx 1.10.3 (Ubuntu)
9090/tcp open     http          Transmission BitTorrent management httpd (unauthorized)
9996/tcp filtered palace-5
19733/tcp filtered unknown
25222/tcp filtered unknown
30316/tcp filtered unknown
33389/tcp open     ms-wbt-server xrdp
33465/tcp filtered unknown
34532/tcp filtered unknown
35761/tcp filtered unknown
35812/tcp filtered unknown
35951/tcp filtered unknown
37679/tcp filtered unknown
38289/tcp filtered unknown
38405/tcp filtered unknown
38995/tcp filtered unknown
40314/tcp filtered unknown
44194/tcp filtered unknown
47808/tcp filtered bacnet

For some reason the results from the nmap scan varied between runs, so we had to run the scan multiple times to get all the services on the host. This was possibly because the server was set up to make automated scanning more difficult.

Once we identified the port the SSH server was running on (2220), we were able to connect to it, and that gave us the exact OS details of the server. We already knew from the PHP info dump that the server was running Ubuntu, along with the kernel version, but this gave us the exact release.
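The banner grab itself is simple: an SSH server announces its version string before any authentication, so a plain TCP read is enough. Below is a hedged sketch; the banner text is modelled on the OpenSSH 6.6.1p1 version from the nmap output, and a local fake server stands in for the real host:

```python
import socket
import threading

def grab_banner(host, port, timeout=5.0):
    """Connect and read the first bytes the server volunteers."""
    with socket.create_connection((host, port), timeout=timeout) as s:
        return s.recv(256).decode(errors="replace").strip()

def _fake_sshd(server_sock):
    # Stand-in for the real server: accept one connection, send a banner.
    conn, _ = server_sock.accept()
    conn.sendall(b"SSH-2.0-OpenSSH_6.6.1p1 Ubuntu-2ubuntu2.8\r\n")
    conn.close()

srv = socket.socket()
srv.bind(("127.0.0.1", 0))
srv.listen(1)
threading.Thread(target=_fake_sshd, args=(srv,), daemon=True).start()

banner = grab_banner("127.0.0.1", srv.getsockname()[1])
assert banner.startswith("SSH-2.0-OpenSSH_6.6.1p1")
```

The "Ubuntu-2ubuntu2.8" suffix in a real banner is what pins down the exact package release.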

Discovering Site architecture:

Since we had to discover the URLs to the members & admin areas before we could attack them, we used dirb, a web content scanner, to get a list of all the public directories/files on the site. This gave us the URLs of several interesting files and directories. One of the files identified by dirb was the site’s sitemap; when we visited the link it gave us a list of other URLs of interest on the site (after replacing the hostname), including the members area and the siteadmin area.

After a long and fruitless effort to use SQL injection on the siteadmin area, we started to explore the other files/URLs identified by dirb. This gave us a whole bunch of files/data that seemed to be left over from other hackathons, so we ignored them.

SQL Injection

The main site appeared to be vulnerable to SQL injection at first glance, because visiting the page with a trailing single quote appended to the URL reloads the page. This meant that we could write queries to it; however, since the page didn’t display a true or false, a straightforward SQL injection wasn’t possible. (We could have tried a blind injection, but that would require a lot of effort for a non-guaranteed result.)

As we explored the remaining URLs in sitemap.xml, one of the links was interesting as it appeared to give a dump of data being read from the site DB. Opening the page while watching the network traffic in the Developer Tools identified a URL that appeared to be vulnerable to SQL injection, and once we tested it we found that the parameter id was indeed injectable.

We used blind SQL injection to gain access by executing true and false statements and observing that the page returns different results for true (displays ‘1’ on the page) and false (displays ‘0’). We checked whether a UNION query ran on the site, which it did, and using other queries we identified the backend to be a MySQL database. Then we found the table name (members), which was an easy guess since the website had an add customer field. After identifying the number of columns in the table we got stuck, because any statements to list the available tables or extract data failed with an error about inconsistent column numbers.
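The true/false oracle described above can be turned into a character-by-character extraction loop. The sketch below is purely illustrative: the injected `secret` expression and the simulated oracle are assumptions standing in for the real HTTP requests against the vulnerable `id` parameter.

```python
from urllib.parse import quote

def probe_url(base_url, condition):
    """Build the injected query string for the vulnerable `id` parameter."""
    return f"{base_url}?id={quote('1 AND ' + condition)}"

def extract_char(oracle, position):
    """Binary-search one character of a secret using only true/false answers."""
    lo, hi = 32, 126  # printable ASCII range
    while lo < hi:
        mid = (lo + hi) // 2
        if oracle(f"ASCII(SUBSTRING(secret,{position},1))>{mid}"):
            lo = mid + 1
        else:
            hi = mid
    return chr(lo)

# Simulated oracle standing in for fetching the page and checking for '1':
SECRET = "members"
def fake_oracle(condition):
    pos = int(condition.split(",")[1])
    threshold = int(condition.rsplit(">", 1)[1])
    return ord(SECRET[pos - 1]) > threshold

recovered = "".join(extract_char(fake_oracle, i) for i in range(1, len(SECRET) + 1))
assert recovered == "members"
```

Each character costs at most seven requests here, which is why blind extraction by hand is tedious and tools like sqlmap automate it.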

Finally, we ran sqlmap, an open source tool for automating SQL injection. It took us a few tries to get the software running, because initially any attempt to scan the site was rejected with a 403 error. It turned out that the connections were being rejected because the site didn’t like the user agent the software sends by default, and adding a flag to randomize the user agent resolved the permission-denied issue.

Once the scan ran successfully we tried to get access to the MySQL user table, but that failed because the account we were authenticating as on the MySQL server didn’t have access to the required table.

sqlmap -u '' --random-agent -p id --passwords

So we then tried getting an interactive shell and an OOB shell, both of which failed. We finally ran the command to do a full dump of everything that the system allowed us to export via SQL injection through sqlmap. This included the DB schema, the table schemas and a dump of every table on the database server that the MySQL user had access to. The command we used was the following:

sqlmap -u '' --random-agent -p id  --all --threads 3

This gave us a full dump of all the tables, and the software was helpful enough to identify password hashes where they existed in a table and offer to attempt to crack them as well. In this case the passwords were hashed with basic unsalted MD5, which was cracked quite easily, giving us the passwords for the first two accounts in the database (admin & demo).
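An unsalted MD5 hash falls to a simple dictionary attack, which is essentially what the cracker does. A minimal sketch; the target hash here is for the word "password" as a stand-in, since the real hashes from the dump are not reproduced:

```python
import hashlib

def crack_md5(target_hash, wordlist):
    """Return the first candidate whose MD5 digest matches the target, else None."""
    for candidate in wordlist:
        if hashlib.md5(candidate.encode()).hexdigest() == target_hash:
            return candidate
    return None

target = hashlib.md5(b"password").hexdigest()
assert crack_md5(target, ["letmein", "admin", "password"]) == "password"
assert crack_md5("0" * 32, ["letmein"]) is None  # not in the wordlist
```

Because there is no salt, the same password always produces the same hash, so precomputed tables and plain wordlists work directly.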

Looking at the rest of the entries in the users table, we noticed that they all had funny values in the email address field; instead of a regular email address we had entries that looked like the following:

,,,"0000-00-00 00:00:[email protected]509a6f75849b",1

As we had no clue what this was about, the first thing we attempted was to access the URL. It gave us a message telling us that the email addresses in the DB were obfuscated by CloudFlare to protect them from bots. A quick Google search gave us a 21-line Python script, which we tweaked to convert all the hashes back to email addresses and passwords. (The code is listed below for reference.)

#! /usr/bin/env python3
# -*- coding: utf-8 -*-
# vim:fenc=utf-8
# Copyright © 2016 xl7dev
# Distributed under terms of the MIT license.

import sys

def deCFEmail(fp):
    # The first two hex chars are the XOR key; the rest is the obfuscated address.
    r = int(fp[:2], 16)
    email = ''.join([chr(int(fp[i:i+2], 16) ^ r) for i in range(2, len(fp), 2)])
    print(email)

if __name__ == "__main__":
    deCFEmail(sys.argv[1])

This gave us the email addresses and passwords for all the users on the site. Since the accounts appeared to have been created by SQL injection, a bunch of them didn’t have any passwords, but the rest were valid accounts for the most part, and we verified a couple by logging in manually with the credentials.

OWASP TOP 10 Vulnerability

To find the vulnerabilities in the home page we tried various manual techniques at first, but drew a blank, so we decided to use OWASP ZAP. This tool can automatically scan a given URL for vulnerabilities, along with a whole lot of other things.

At first the scan failed because of the same user-agent issue as earlier. This time we took a different approach: we configured OWASP ZAP as a proxy server and configured Firefox to route all its traffic through it. This got the site into the software, and we were then able to trigger both an active scan and a spider scan of the site.

This gave us detailed reports that highlighted various issues in the site which we submitted.

Redirecting HomePage

The redirection of the home page was quite simple. We tried inserting a customer name with JavaScript tags in it and were able to do so successfully. So we inserted the redirect script into the DB, and the system automatically redirected the page whenever the Customer list section was accessed.

Other Interesting Finds

The nmap scan told us that in addition to port 80 a web server was listening on ports 81, 82, 8000, 8080 and 8086.

Ports 82, 8000 and 8086 were running standard installs of nginx and we didn’t find much of interest at these ports even after we ran dirb on all of them. Port 8080 appeared to be running a proxy or a Jenkins instance.

Port 81 was the most interesting, because it was running an nginx server that responded to any query with a 403 error. When we tried accessing the site via the browser we got an error about corrupted content.

We were unable to identify what the purpose of this site was but it was interesting.

SSH Banner / PHP Shell

The web server instance running on port 80 had its version string set to the following text: “This is not* a web server, look for ssh banner Server at Port 80”. So we went back and investigated the SSH banner from the SSH server on port 2220. The banner was encoded, and to decode it we repeatedly converted the ciphertext from hex to ASCII. Each conversion gave us the following results:


 37333733363832303632363136653665363537323230363636663732373736313732363432303733366336313733363832303633366637353663363432303663363536313634796f75to a #

ssh banner forward slash could lead you to a #sh3ll.php
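The repeated conversion is easy to script. A small sketch; the sample string below is cleanly double hex-encoded for the demo, whereas the real banner also mixed in some plain text at the end:

```python
def hex_rounds(ciphertext, rounds):
    """Decode a hex string back to ASCII the given number of times."""
    for _ in range(rounds):
        ciphertext = bytes.fromhex(ciphertext).decode("ascii")
    return ciphertext

# Double-encode a stand-in string, then peel both layers off again.
sample = "sh3ll.php".encode().hex().encode().hex()
assert hex_rounds(sample, 1) == "sh3ll.php".encode().hex()  # one layer left
assert hex_rounds(sample, 2) == "sh3ll.php"                 # fully decoded
```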

Once we got the full decoded text we knew that there was a potential webshell on the server, but it wasn’t apparent where the shell was located. After hit-and-trial failed, we turned back to our old faithful dirb to see if it could find the shell.

dirb lets us specify a custom word list, which it iterates through as paths, and we can also have it append an extension to each word, so we created a file called test with the following content:

suramya@gallifrey:~$ cat test 

and then ran the following command:

suramya@gallifrey:~$ dirb test  -X '.php'

This gave us the location of the shell.

Accessing the link gave us a page with a message “you found a shell, try pinging google via sh3ll.php?exec=ping”

Accessing the URL with the additional parameter gave us a page showing the output of the ping command.

February 5, 2018

Is it a good idea to stop reading news?

Filed under: My Thoughts — Suramya @ 5:40 PM

Earlier today I was browsing the web and ended up on this HackerNews Thread where one of the users had posted the following comment:

I have recently stopped reading any kind of news. As a result I find that my mind is lot less cluttered. I have realized that once you give it up, you don’t really miss it a lot.

This made me think and I was wondering what the benefits are if we stop reading the news and what the downsides are of the same.

A little while ago a lot of the news items from around the world were pretty depressing, and I found that if I read my news feed first thing in the morning, as I normally did, I ended up feeling a bit out of sorts for a while. Not depressed per se, but with more of a bleah attitude for part of the morning. After I figured this out I stopped reading general news first thing in the morning, as I figured the issue was that I was reading the news while half asleep, when a lot of my brain was still struggling to wake up, making it harder for my usual snark to kick in. Instead, I switched to reading only the tech news feeds early in the morning and catching up with the world news later in the day (usually in the evening on the way back home). This worked best for me for a while, but after a bit I changed my reading habits again, and now I read the news (both tech and general) on the way to work and am fine with it. Plus, another good development is that I get out of the house sooner if I am not cocooned in bed catching up with the news. 🙂

So, is it a good idea to stop reading any news? I don’t think so, even after my experience. Knowing what is going on in the world is important, and shutting yourself off from the world is not an answer. There are a lot of issues in the world, and the first step in fixing them is to know about them. I mean, if you don’t even know a problem exists, then how are you going to think about a solution for it? There is a quote from Isaac Asimov that seems relevant here:

“Your assumptions are your windows on the world. Scrub them off every once in a while, or the light won’t come in.”
― Isaac Asimov

So the question becomes: how do I scrub my windows to the world? The answer is quite simple: read about what is happening in the world. There might be new discoveries, events etc. happening that will challenge your thinking and maybe result in a complete change in your thought process. Don’t get put down by the constant negative news in the media. The fact is that it’s not all bad out there; there are good things happening all over the world, but that doesn’t sell, so the media focuses on the negative aspects to sell papers (or get page views etc.). Bill Gates wrote about this recently as well. In a recent study, folks took 15 different measures of progress (like quality of life, knowledge, and safety) and found that the world is actually getting better in spite of the mess we keep seeing in the news all the time.

All that being said, it is quite possible that you end up getting down/depressed after reading & watching so much negative news in the press. This is a normal reaction. John Scalzi, who is one of my favorite authors, had the following advice on how to deal with this scenario (it was published about a year ago but is still valid):

3. Disconnect (temporarily). Especially now, it might be useful for a “hard reset”: taking a week (or two! Or more!) away from most news and social media in order to give your brain the equivalent of a few deep, cleansing breaths and the ability to switch focus away from the outside world and back into your internal creative life.

It’s often hard to do this — social media in particular is specifically designed to make you feel like if you’re not constantly attached to it then you’re missing something important. But here’s the thing: Even if it were true (which it usually is not), there are millions of other people out there to deal with it while you take a week off from the world to get your head right. Let them.

What are your thoughts on this topic? Do you feel that giving up reading the news is a good idea? Let me know via the comments below (or via email).

This is all for now. Will post more later.

– Suramya

February 1, 2018

Viewed the Lunar Eclipse + Blue Moon + Super Moon last night and it was awesome!

Filed under: My Life — Suramya @ 12:04 PM

As I mentioned in my previous post, yesterday was the Lunar Eclipse + Blue Moon + Super Moon combination that last happened 150 years ago. Initially I wasn’t sure that I would be able to make it home in time to view the eclipse, but things worked out in the end and I was able to make it. To make things more fun, a few friends who knew that I have a telescope invited themselves over (after asking me 🙂 ), so we had a mini get-together/eclipse-watching session up on the roof. The eclipse started being visible in Bangalore at 6:21pm, but thanks to the buildings around my place (and the bright lights at Leela Palace) I couldn’t really see the moon till almost 6:45pm. Most of the people arrived at my place by 7, so we went upstairs then and were there till the end of the eclipse at about 8:30pm. A lot of other folks from DD were also there on the roof, but we were the only ones with a telescope, so we got a lot of envious looks 😉

At first it was hard to get the moon in focus during the eclipse as it was very dim, but after a few failed attempts we managed it, which allowed Anirudh to take the pics of the moon through the telescope. (See below)

Full Lunar Eclipse (PC: Anirudh)

Near the end of the total eclipse. (PC: Anirudh)

About half way through the end of the partial eclipse (PC: Anirudh)

The banner at the bottom of the pic was added by Anirudh, so even though I personally feel that it is ugly 🙂 I decided to keep it in so that the credit is properly given.

Anirudh checking out the eclipse with Ananya, Josefine and Priyank waiting for their turn at the telescope.

Once the eclipse was over we all went back down to my place to hang out for a bit. Some of the folks had to leave early because of other commitments (and because we had work the next day), but Anirudh, Sharukh, Jani and I were up till almost 1am talking about all sorts of random stuff, from computer security to feminism.

Group pic at my place

Overall it was a fun evening. If I had known for sure that I would make it home for the eclipse I would have asked more friends to come over but since that wasn’t the case I wasn’t able to… But there is always the next time.

This is all for now. Will post more later.

– Suramya

January 30, 2018

Lunar Eclipse + Blue Moon + Super Moon happening together tomorrow!

Filed under: My Life — Suramya @ 11:41 PM

The first eclipse of 2018 will be a full lunar eclipse happening tomorrow (31st Jan). To make things more interesting, this is also a Blue Moon and a Super Moon. Such an event hasn’t happened for more than 150 years. The next time a Blue Moon passes through Earth’s umbra will be on 31st Dec 2028, and after that on 31st Jan 2037. Both of those eclipses will be total as well.

The timings are not the most convenient for me as I will still be at work during the full eclipse unless I leave early. In India the eclipse will follow this schedule:

18:21 Wed, 31 Jan: Total eclipse begins. Completely red moon; the moon is close to the horizon, so make sure you have a free line of sight to the east-northeast.

18:59 Wed, 31 Jan: Maximum eclipse. The moon is closest to the center of the shadow.

19:37 Wed, 31 Jan: Total eclipse ends.

20:41 Wed, 31 Jan: Partial eclipse ends.

I wonder if I can carry the telescope to the Office. 🙂 If that is not possible then I just might leave early from work and log back on later in the night. Hopefully I won’t have any late evening face to face meetings tomorrow.

For the last eclipse I had folks over at my place for a moon-watching get-together, and if this one was over the weekend or later in the day then I would have done the same again. But…

Will try to take some good pics and share them. This is all for now. Will post more later.

– Suramya

January 29, 2018

How can we secure a Client App so that the server side can detect tampering?

Filed under: Computer Security,My Thoughts — Suramya @ 5:09 PM

If you have been following Aadhaar in the news/social media recently, then you must have seen the posts by some prominent cyber security folks about basic security issues with Aadhaar. I couldn’t resist chiming in with my two cents, and pretty soon the conversation switched from the glaring security issues with Aadhaar to how we could secure applications when the client cannot be trusted. Sushil Kambampati had some interesting questions on this topic, and we tried having a discussion on Twitter itself for a short while, but since Twitter is not the best medium for long-winded conversations we switched to email pretty soon. The following is a summary/expansion of my conversation with him.

Special thanks to Sushil for asking the two questions listed below, thereby motivating me to write this post. Please note that all the items below are my personal thoughts; I don’t claim to know everything, so some of the suggestions might not be the best option or might require additional safeguards beside the ones I talk about.

What are the risks if the client has been modified by an attacker?

The possibilities are endless if an app has been modified and can still successfully communicate with the server backend. Attackers can tamper with an app to install a backdoor, re-sign it and publish the malicious version to third-party app marketplaces. They can also change the app to query the server in ways the designer didn’t expect, e.g. query the DB for all possible values of the Aadhaar number to identify valid ones. They can also attempt SQL injection or other attacks on the server by sending it data that it doesn’t expect.

How can the server-code detect whether the client app has been modified?

This is a very interesting problem, and there is no foolproof method to ensure that the local client hasn’t been modified. That said, we can always make it harder for the attacker to modify the app. Some ways we can detect tampering are listed below, along with potential ways to bypass the checks. (I am going to talk about app-side checks in addition to server-side ones, since both need to be performed to secure the app.) I specifically talk about Android applications here, but the same is valid for any server/client system where the client can’t necessarily be trusted (and if your client is installed on a machine you don’t control, then it definitely can’t be trusted).

  • We obfuscate/shrink the code using ProGuard. This makes it more difficult (though certainly not impossible) to reverse engineer the code, by making it harder to read a stack trace because the method names are obfuscated. Another thing we can do to harden the app is to include checks that detect whether the app is running in a virtual environment (emulator) and abort if so. This check should not be easy to disable, e.g. by setting a flag; instead the build process should add the check when building the release version, or something similar, while making it as hard as possible to turn off. Finally, we should ensure that all debug code is stripped out of the release build. All of this makes things harder for the attacker.

    The communication between Server & Client should be over a secure/encrypted channel (use HTTPS not HTTP), all local data should be encrypted with a unique password that is generated at runtime (1st run) using a random seed.

  • We have the app send a checksum that the server verifies every time an API call is made.
  • This is a very basic check that is fairly simple to bypass, as any competent attacker will also modify the app to send the correct checksum value even though the actual checksum is different.

  • Have the server request a byte string from a random location in the app, e.g. “send me 100 bytes starting from byte #2000 from the beginning of the file”. This check would fail if any changes were made to the file in the section the check queried.
  • The issue is that there is a high probability that the location requested by the server is not the location that the attacker has modified. Also, if the attacker is sufficiently motivated, they can append a copy of the original app to the tampered app and then modify the check function to return the values from the original app when the server attempts to verify the integrity.

  • Verify your app’s signing certificate at runtime.
  • All applications in the app store are signed with the developer’s private key, and the app signature will be broken if the APK is modified. By default Android will not allow you to install an app where the signature doesn’t match. However, an attacker can potentially bypass this by changing the code/value you are checking against, and the app can still be installed manually if the phone is rooted.

  • Verifying the installer
  • Each app contains the identifier of the app which installed it. Therefore, with a very simple check you could have your app verify the installer ID. This can be an in app check and also triggered by a server API call. However with access to the code (by reverse engineering the app) this check could potentially be commented out.

  • Monitor your server side logs
  • This is very important, because any attempts to hack the server/bypass restrictions will leave a trace in your logs. If you have configured good log monitoring rules then this can act as an indicator of someone trying to hack your application. Then you have the option of putting countermeasures into action like blacklisting etc.
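The random byte-range check from the list above can be sketched server-side as follows; the stand-in binary, names and sizes are all illustrative assumptions, not a real implementation:

```python
import hashlib
import secrets

# Stand-in for a pristine copy of the shipped app binary kept on the server.
pristine_app = bytes(range(256)) * 40

def make_challenge(length=100):
    """Server picks a random (offset, length) window inside the binary."""
    offset = secrets.randbelow(len(pristine_app) - length)
    return offset, length

def verify_response(offset, length, client_digest):
    """Server-side check: the client's digest must match the pristine slice's."""
    expected = hashlib.sha256(pristine_app[offset:offset + length]).hexdigest()
    return client_digest == expected

# An intact client answers correctly; one with even a single flipped byte fails.
offset, length = make_challenge()
honest = hashlib.sha256(pristine_app[offset:offset + length]).hexdigest()
assert verify_response(offset, length, honest)

tampered = bytearray(pristine_app)
tampered[offset] ^= 0xFF
forged = hashlib.sha256(bytes(tampered[offset:offset + length])).hexdigest()
assert not verify_response(offset, length, forged)
```

As noted in the caveats above, this only catches a modification if the random window happens to cover it, and a client that keeps an untouched copy of the original binary can answer every challenge honestly.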

Hope this all makes sense. Please let me know if you have any further questions by posting a comment below or emailing me.



December 30, 2017

Checking out the Classic Coffee trail

Filed under: My Life — Suramya @ 11:24 PM

A few weekends ago I got the opportunity to go for ‘The Classic Coffee Trail’ organized by the wonderful folks over at ‘Food Lovers’. Now some of you might be wondering why someone who doesn’t really like drinking coffee would go to an event about coffee. I went for two main reasons:

* I love to learn and this looked like a great opportunity to learn about coffee and expand my horizons
* It was highly recommended by Sachin and Diana who were also going to the event.

We started the trip by waking up at 4:30am on Saturday so that we could leave by 5:30am. This was especially painful as I had had an early morning on Friday as well (woke up at 3am) for a day-long work trip to Chennai. But sleep is no match for Red Bull followed by a shower, so by the time Sachin and Diana were at my place I was wide awake and ready to roll. The drive down to Sakleshpur was pretty scenic and uneventful for the most part. Sachin and Diana alternated driving while Jani and I relaxed in the back. At first I managed to stay awake, but after a point the Red Bull effect wore off and apparently I had a full conversation with Sachin while being about 90% asleep. I am not sure which is scarier: the fact that I could have a full, intelligent conversation while sleeping, or that Sachin was sleepy enough that he didn’t realize I was pretty much talking in my sleep.

We made good time and got to the Harley Estate by 11 or so, where we were met by Kripal from Food Lovers, who had organized the event. I must say that Kripal and his team made every effort to make our trip smooth and as enjoyable as possible, starting with the phone calls before the trip to ensure we knew the route and continuing during the event with their attention to detail. While waiting for the rest of the attendees to arrive we got to meet Chandini D. Maneesh and Tapaswini Purnesh, who were our hosts for the weekend. Both sisters are very friendly and knowledgeable about coffee. Even though I am not a coffee drinker, I was tempted to start after listening to the two of them talk about coffee with such energy and enthusiasm.

We were staying in ‘Whispering Tree’, which was a good 10-minute jeep ride from the entrance of the estate. The cottage is nestled in the forest; at night some of the folks heard animals out there, and the local farmers had to constantly make noise to prevent wild boar from destroying their crops. Right outside the cottage there was a hammock which looked very inviting, so I immediately laid claim to it before anyone else could beat me to it. After a few minutes of relaxation I suddenly found out why the hammock was so conveniently empty. Turns out there was a nest of fire ants on one of the trees the hammock was tied to, and they loved the opportunity to take a bite out of me. In fact they took several bites before I managed to get them all off me. But I did finally get revenge on the ants, as there is a local dish made out of ground ants and it was served to us the next day for breakfast. Even though I didn’t eat it, it gave me great pleasure to watch others eating the ants. 🙂

Jeep Ride to the cottage

Relaxing on the non-ant infested hammock

After we relaxed for a bit it was time for lunch, which was served in a lush meadow with a stream running alongside it. A small waterfall a few hundred feet away was tempting me to go for a swim, but since we had all neglected to bring swimsuits we had to make do with putting our feet in the water, which was nice and cold. All in all I thought this place would be amazing to camp in, and I was pleased to find out that they do allow folks to pitch tents in the area and camp out. The food was very tasty and went very well with the lovely wine we had. All too soon lunch was over and we started the coffee tour, which began with a tasting where we tried the same coffee prepared using 3 different methods; it was unbelievable how much difference the brewing method makes in the final taste of the drink. I learned more about coffee brewing in one hour than I had in the past 36 years of my life.

Enjoying cooling down in a cold stream of water

Lunch at the Gazebo

Group Photo at the beginning of the tour (after lunch) [PC: Food Lovers]

Once the tasting was done we walked over to the main bean processing unit, where the day’s collection of beans was ready to be processed. Each bean picker weighed their day’s pickings and then dumped them into a pit that fed a gigantic pulping machine, which separates the skin and pulp from the bean. The beans are then sorted by weight by passing them through water channels, where the lighter beans float and the heavier ones sink. Next they are passed through a series of rotating drums to sort them by size, after which they are dried on raised drying tables for a few days and finally sun-dried on the floor. To prevent spoilage the beans are turned regularly with big rakes, and in the evening, just as the sun goes down, they are collected into a massive pile and covered with a tarp to keep the dew off. This is necessary because it gets pretty cold at night and there is dew everywhere, which is quite bad for coffee beans.

After the tour, we sat around and chatted for a little while, enjoying some great snacks and coffee (obviously). Once everyone was done we went back to our cottages to rest and get ready for dinner. Dinner started with a bonfire, a barbecue and some great wine. (It wasn’t just me who said that; Sachin liked it as well and he’s the expert 🙂 ). The food was simple but quite tasty. We spent a couple of hours just talking about random stuff and enjoyed being outside without worrying about work, pollution, etc. Some of the folks were worried about the local wildlife coming to say hi to us, but fortunately nothing decided to come say hello, and all too soon it was time to call it a night, so we crashed.

Group Pic at Dinner [PC: Food Lovers]

The next day started really early as we were scheduled for a plantation walk. I seriously considered sleeping in and letting the others go on without me, but I am glad I didn’t, since the walk was a lot of fun. We got to watch the plantation workers pluck the berries, and we tasted a raw berry, which was surprisingly sweet. Since the Harley Estate follows the selective harvesting method, they only pick the ripe cherries from the trees, which as you can imagine is a pretty labor-intensive process. The walk was not very taxing, and since it wasn’t the rainy season we even managed to avoid leeches, which was a big plus (I am not a fan, but Jani finds them to be ‘sexy’ for some reason).

Learning about Coffee picking

Post the walk we were all pretty hungry and fell upon breakfast like starving people. Jani and I strategically seated ourselves so that the servers had to pass us every time they came out with food, so we had first pick. 🙂 This ensured that we were done with breakfast while others were still waiting, and we spent the time afterwards just sitting in the sun relaxing.

Look at the pretty flowers (and us)

Enjoying early morning coffee in the sun

Finally everyone was done with breakfast and it was time for us to bid farewell to each other. With a heavy heart we packed up our bags and started the long drive back, which was mostly uneventful; we reached Bangalore without any issues and promptly got stuck in traffic.

If you ever want to combine a relaxed trip with learning about coffee I highly recommend this place. Check it out if you have the time.

Well, this is all for now. Will write more later.

– Suramya

December 14, 2017

My primary desktop is dead

Filed under: Computer Hardware,My Life,Techie Stuff — Suramya @ 12:00 AM

The fan on my computer was giving me some problems (it sounded like an aircraft taking off), so I thought I’d replace it with a new one. The new fan/heatsink arrived earlier this week, and today I finally had the time to try installing it.

First I had to remove the old fan and heat sink, so I watched a video online on how to do it and followed the instructions exactly. Unfortunately the thermal paste holding the CPU to the heat sink was a little too strong, and while I was trying to remove the heat sink I managed to pull the top half of the CPU away from the bottom half. So now my computer is a very expensive paperweight till I get a new CPU.

Looking online, I found that the AM3+ socket my motherboard uses has been phased out, and even though there are processors that would work with it, for the same cost or slightly less I can get a new, more powerful CPU and motherboard. So obviously I am going for the latter option.

I have selected a new board and CPU on Amazon but haven’t ordered them yet because I want to check prices at some of the local shops first. Plus, an Amazon order will take a couple of days to get here and I want to avoid the delay.

It’s not that I don’t have a machine right now, as I have two laptops, but the desktop was my primary machine, configured to be exactly the way I like it, and it’s annoying to have to use a laptop that isn’t set up the same way.

I am dictating this blog post on my phone using Google voice typing, and it’s about 99% accurate, which is pretty cool. I still can’t figure out how to add punctuation during dictation, but other than that it works perfectly.

Well, this is all for now. Will write more later, hopefully on my new desktop.

– Suramya

December 6, 2017

How to ensure that your friend request is never accepted

Filed under: My Life — Suramya @ 11:59 PM

Those of you who know me know that I don’t log on to Facebook very often. At most I will check it every few months, or when someone asks me to log on to check something specific (or when I have to accept an event invite that is only on FB). So it’s quite common for a friend request to sit in my inbox for a few months at a time with no response.

Yesterday I was on FB to reply to all the birthday messages, and I happened to check the messenger messages from people not on my friend list. That was where I found this gem. It is from a person (let’s call him Mr P) whom I met during a trek; he apparently sent me a friend request afterwards, and since I log on so infrequently I didn’t see it. Instead of sending me a polite reminder, as some other folks have (both on FB and via WhatsApp), he sent me the message below. When I first read it I seriously considered posting the screenshot here without blurring out his name, but in the end I decided that just because he is acting like an idiot, it doesn’t mean that I should as well.

After reading the message a few things are quite clear: there is no way I am ever adding him as a friend on any social media. But let me go ahead and answer his questions anyway. His questions are in italics while my answers are in normal font:

>i would say f off moron..
Ok, that’s a lovely way to start a conversation. I am now eager to talk to you. I mean how can I not, when I have waited so long to be properly insulted.

>who wouldn’t accepted my frient request…
Umm… I would hazard a guess and say pretty much everyone who got a message like this from you. If I was on the fence about adding you as a friend, this would ensure that I don’t. There is a saying in Hindi: Koyle ki dalali mein muh kala (if you deal in coal, you will end up with a black face), and the moral is that you are who you associate with. If you have a friend who is an ass, you will come across as an ass as well in the worst case, and as someone who agrees with their behavior in the best case. So if I add you as a friend I will seem to condone your behavior, which is something I am definitely not doing.

>Are u a billioner…
Not yet, but I’m working towards it. 🙂 Why? Are you planning on asking for a loan?

>F u honey
The closest anyone has come to calling me honey till now is a friend who told me that I was like a grumpy bear in the morning till I woke up fully. But in any case, thanks but no thanks. You are not my type 😉

If we ever meet in person, on a trek or elsewhere, I will not be interacting with him, because he has confirmed my opinion of him as an immature idiot.

Have you received any such messages when you didn’t accept or respond to a friend request? If so, how did you react? Other than writing this blog entry, I have not replied to his message or interacted with him at all.

– Suramya

December 5, 2017

Dominos Pizza online has stronger password requirements than Citibank India Online

Filed under: Computer Related,My Thoughts,Techie Stuff — Suramya @ 11:59 PM

Today I decided to change my IPIN (Internet PIN) on Citibank, as I haven’t changed it in a while and it’s a good idea to do so on a regular basis. So I logged in to my account, clicked on the password reset link, and got the following text:

The first item there is fairly standard, but what really surprised me were items #3, 4 & 6. What do you mean I can’t have any special characters in my password? And why can’t I have a password longer than 16 characters, when the NIST password guidelines recommend allowing passwords of up to 64 characters in length?

In contrast, the Domino’s Pizza online portal has stronger requirements: it demands an uppercase letter, a lowercase letter, a digit and a special character in the password, making it a lot harder to crack than a Citibank password.
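To make the comparison concrete, here is a small Python sketch of the kind of composition check the Domino’s portal appears to enforce. This is my reconstruction of the stated rules, not their actual code:

```python
import string

def meets_dominos_style_policy(pw: str) -> bool:
    """Hypothetical composition check: at least one uppercase letter,
    one lowercase letter, one digit and one special character."""
    return (any(c.isupper() for c in pw)
            and any(c.islower() for c in pw)
            and any(c.isdigit() for c in pw)
            and any(c in string.punctuation for c in pw))

print(meets_dominos_style_policy("Pizza123"))   # False: no special character
print(meets_dominos_style_policy("Pizza@123"))  # True
```

Ironically, NIST now advises against composition rules like these in favor of simply allowing long passwords, but either approach beats capping length at 16 characters.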

This is not all; the best part is yet to come. I use a password manager, and my generated password was 22 characters long this time, so I pasted it into the form and the system accepted the password change. Since I am a paranoid person, I decided to check that the password had changed successfully by logging in with it. Imagine my surprise when an error message popped up telling me that my password can’t be longer than 16 characters. I was confused, since the password change form had taken my 22-character password without trouble, so I tried logging in with the old password, and that obviously didn’t work. Finally I tried removing the last 6 characters from my new password and was able to log in.

Basically the stupid system silently truncated my password to 16 characters before saving it, instead of warning me that the password was too long when I was changing it, which would have been the logical thing to do.
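The behavior can be reproduced in a few lines of Python. This is purely my guess at what their backend does (the hashing function and the limit handling are placeholders); the point is that the change-password path truncates while the login path validates, so any password over 16 characters silently becomes its own 16-character prefix:

```python
import hashlib

MAX_LEN = 16  # the observed Citibank limit

def hash_pw(pw: str) -> str:
    # Placeholder for whatever hashing the real backend uses
    return hashlib.sha256(pw.encode()).hexdigest()

def change_password(new_pw: str) -> str:
    # The bug: the change form silently truncates an over-long
    # password instead of rejecting it
    return hash_pw(new_pw[:MAX_LEN])

def login(pw: str, stored_hash: str) -> bool:
    # The login form, by contrast, rejects over-long input outright
    if len(pw) > MAX_LEN:
        raise ValueError("Password cannot be longer than 16 characters")
    return hash_pw(pw) == stored_hash

stored = change_password("a-22-character-secret!")  # 22 chars, accepted silently
print(login("a-22-character-secret!"[:MAX_LEN], stored))  # True: only the prefix works
```

The logical fix is to apply the same validation on both paths, or better still, follow the NIST guidance and accept anything up to 64 characters without truncating.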

Citibank needs to update its systems to follow the NIST guidelines and start allowing people to choose more secure passwords.

Well, this is all for now. Will write more later.

– Suramya
