Suramya's Blog : Welcome to my crazy life…

October 5, 2022

3D Scanning was used over 160 years ago to create photosculptures

Filed under: Interesting Sites,My Thoughts,Tech Related — Suramya @ 1:32 PM

When we talk about 3D scanning we tend to assume it is an emerging technology, one that has been growing more and more popular thanks to recent advances. A use case that is gaining traction is scanning sculptures or art installations so that the scans can be published online, converted for VR, or used to 3D print an exact replica. For example, the State Darwin Museum in Europe has been slowly digitizing / 3D scanning its collection, and other museums have been doing the same.

But interestingly, this is not a new technology: it was in use over 160 years ago to create what are known as photosculptures. A recent article on Hackaday.com describes how, starting in 1861, the art of creating realistic, three-dimensional replicas from a series of 24 photos was extremely popular. This process was called photosculpture and was invented by François Willème, a French painter, photographer and sculptor.


Example of a photosculpture created using this technique. (PC: University of Virginia: Department of Art)

He perfected the technique of photographing a subject standing at the center of a circle of 24 synchronized cameras, producing a set of profiles that could be projected onto a screen. A pantograph was then used to cut each projected profile into a thin sheet of wood, and the artist would assemble the cuttings into a rough 3D replica of the subject. Once the base was created, the details were filled in using materials such as bronze, plaster of Paris and terra cotta to create a realistic result.


A visual overview of how Photosculptures were created

This whole process was a lot cheaper and faster than having a sculpture made the traditional way, so it became quite popular with the public for a while. But with competitors patenting their own versions and demand falling, Willème had to shut down the studio by late 1868. For more details on the process, check out the article More than 100 Years before 3D Printers, We had Photosculpture, which is quite fascinating.

It made me think about the unspoken assumption that previous generations were not as smart or advanced as we are, and that only the modern world produces amazing breakthroughs that wouldn't have been possible earlier. Then you read about inventions and techniques from hundreds of years ago that do the same thing (albeit a bit more crudely) as our modern cutting-edge technologies. A lot of scientific advances were lost to history for various reasons, and sometimes I dream about how the world would look if we had not lost the Library of Alexandria or Nalanda University, which were among the many institutions destroyed by invaders, their staff and students slaughtered. Imagine how many advances were lost, how much wisdom was lost over the years because of this…

– Suramya

October 4, 2022

Workaround for VPN Unlimited connection issues with latest Debian

VPNs are a great way to ensure that your communication remains private when using a public internet connection, such as airport or coffee shop WiFi. They are also good for getting access when a site is blocked where you are; for example, in India VideoLan.org, the main site for the VLC media player, has been blocked for a while. I primarily use VPN Unlimited on all my systems as I have a lifetime subscription, though I also have other VPNs that I use occasionally.

Unfortunately, the native VPN Unlimited application for Linux stopped working a while ago due to a compatibility issue with SSL. When I upgraded to the latest version of Debian back in July 2022 it suddenly failed with the following error message:

vpn-unlimited: symbol lookup error: /lib/libvpnu_private_sdk.so.1: undefined symbol: EVP_CIPHER_block_size

Reinstalling the software didn't resolve the issue, and neither did searching the internet. When I reached out to support they told me that Debian 11 wasn't yet supported and they didn't have an ETA for the new version. They did recommend that I manually create & download an OpenVPN config from their site, which would allow me to connect to the VPN manually using OpenVPN instead of the app. Unfortunately, the generated config didn't work either; it would fail to connect with the following error messages in the logs:

Sep 21 02:56:55 StarKnight NetworkManager[1123]:  [1663709215.0845] vpn[0x559d7fc46900,833a72d8-a08a-474e-a854-c926cd6c694a,"VPN Unlimited"]: starting openvpn
Sep 21 02:56:55 StarKnight NetworkManager[1123]:  [1663709215.0847] audit: op="connection-activate" uuid="833a72d8-a08a-474e-a854-c926cd6c694a" name="VPN Unlimited" pid=2829 uid=1000 result="success"
Sep 21 02:56:55 StarKnight kded5[2780]: org.kde.plasma.nm.kded: Unhandled VPN connection state change: 2
Sep 21 02:56:55 StarKnight kded5[2780]: org.kde.plasma.nm.kded: Unhandled VPN connection state change: 3
Sep 21 02:56:55 StarKnight NetworkManager[233850]: 2022-09-21 02:56:55 WARNING: Compression for receiving enabled. Compression has been used in the past to break encryption. Sent packets are not compressed unless "allow-compression yes" is also set.
Sep 21 02:56:55 StarKnight nm-openvpn[233850]: DEPRECATED OPTION: --cipher set to 'AES-256-CBC' but missing in --data-ciphers (AES-256-GCM:AES-128-GCM:CHACHA20-POLY1305). OpenVPN ignores --cipher for cipher negotiations.
Sep 21 02:56:55 StarKnight nm-openvpn[233850]: OpenVPN 2.6_git x86_64-pc-linux-gnu [SSL (OpenSSL)] [LZO] [LZ4] [EPOLL] [PKCS11] [MH/PKTINFO] [AEAD] [DCO]
Sep 21 02:56:55 StarKnight nm-openvpn[233850]: library versions: OpenSSL 3.0.5 5 Jul 2022, LZO 2.10
Sep 21 02:56:55 StarKnight nm-openvpn[233850]: WARNING: No server certificate verification method has been enabled. See http://openvpn.net/howto.html#mitm for more info.
Sep 21 02:56:55 StarKnight nm-openvpn[233850]: NOTE: the current --script-security setting may allow this configuration to call user-defined scripts
Sep 21 02:56:55 StarKnight nm-openvpn[233850]: OpenSSL: error:0A00018E:SSL routines::ca md too weak
Sep 21 02:56:55 StarKnight nm-openvpn[233850]: Cannot load certificate file /home/suramya/.local/share/networkmanagement/certificates/E87E7A7D6DA16A89C7B4565273D3A792_hk_openvpn/cert.crt
Sep 21 02:56:55 StarKnight nm-openvpn[233850]: Exiting due to fatal error
Sep 21 02:56:55 StarKnight NetworkManager[1123]:  [1663709215.1095] vpn[0x559d7fc46900,833a72d8-a08a-474e-a854-c926cd6c694a,"VPN Unlimited"]: dbus: failure: connect-failed (1)
Sep 21 02:56:55 StarKnight NetworkManager[1123]:  [1663709215.1095] vpn[0x559d7fc46900,833a72d8-a08a-474e-a854-c926cd6c694a,"VPN Unlimited"]: dbus: failure: connect-failed (1)

After a little more back and forth with the support team (who were extremely responsive and quick), and who in turn reached out to their developers, we identified the issue with the OpenVPN config. The fix will be deployed to all their servers by the end of this month. In the meantime I was given a workaround that resolved the issue for me: add this line to your OVPN file under the VPN section:

tls-cipher=DEFAULT:@SECLEVEL=0 
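If your VPN connection is managed through NetworkManager (as in the logs above), the VPN section is the [vpn] section of the connection's keyfile under /etc/NetworkManager/system-connections/. A minimal sketch of the placement; everything else in your file stays as it is, only the tls-cipher line is added:

[vpn]
service-type=org.freedesktop.NetworkManager.openvpn
tls-cipher=DEFAULT:@SECLEVEL=0

If you instead run the openvpn client directly against the downloaded .ovpn file, the equivalent directive in OpenVPN's own syntax would be: tls-cipher "DEFAULT:@SECLEVEL=0"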

More information on this is available in the OpenVPN forum. Keep in mind that this is not a secure configuration (SECLEVEL=0 re-enables weak signature algorithms, which is why the "ca md too weak" error goes away), so if you are working on something really sensitive you should use another VPN until the issue is properly fixed rather than relying on this workaround.

Anyway, I just wanted to share this here for others who might be hitting the same issue. Hope this helps.

– Suramya

October 3, 2022

Debian to allow non-free firmware in its default installer

Filed under: Linux/Unix Related,My Thoughts,Tech Related — Suramya @ 10:19 AM

One of the problems preventing new users from adopting Debian is that if your hardware is not supported by an open ('free') driver/firmware, the installer doesn't install any, and it is then a painful process to download and install the driver manually, especially if it is for the wireless card. With earlier laptops you could always connect via a network cable to fetch the drivers, but newer systems often don't come with an Ethernet port (which I think sucks, BTW), so installing Debian on those systems is a pain.

Debian leadership has been debating how to fix this over the past few months, and there was a vote to decide how Debian would handle non-free firmware going forward. The voting has now completed and the verdict is in: Debian has decided that the official installer media can include firmware that is otherwise not part of the Debian system. The non-free firmware will be automatically installed and activated when the installer determines that it is needed for the OS to function. The setup will notify the user in such cases and provide instructions on how to disable the changes if required.

The Debian Project made the following statement:

We will include non-free firmware packages from the “non-free-firmware” section of the Debian archive on our official media (installer images and live images). The included firmware binaries will normally be enabled by default where the system determines that they are required, but where possible we will include ways for users to disable this at boot (boot menu option, kernel command line etc.).

When the installer/live system is running we will provide information to the user about what firmware has been loaded (both free and non-free), and we will also store that information on the target system such that users will be able to find it later. Where non-free firmware is found to be necessary, the target system will also be configured to use the non-free-firmware component by default in the apt sources.list file. Our users should receive security updates and important fixes to firmware binaries just like any other installed software.

We will publish these images as official Debian media, replacing the current media sets that do not include non-free firmware packages.

This is a great choice and will allow the installer to work pretty seamlessly for most users. I know there are purists who will shout that this is not the 'true way of free software', but they are a minority for the most part. Installers need to be simple while still allowing power users more granular control of the process. This change removes a major barrier to the adoption of Debian and makes the lives of millions of system administrators a lot easier.
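In practice, the non-free-firmware component just becomes one more entry in apt's package sources on the installed system; something like the line below (the release name here is a placeholder):

deb http://deb.debian.org/debian bookworm main non-free-firmware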

Source: Slashdot: Debian Considers Changing How It Handles Non-Free Firmware
More details at: Debian Choose A Reasonable, Common Sense Solution To Dealing With Non-Free Firmware

– Suramya

October 2, 2022

Upgrading Debian Unstable – How to avoid obvious problems

Filed under: Computer Tips,Linux/Unix Related,Tech Related — Suramya @ 11:59 PM

If you are using Debian Unstable there is a chance that your system won't work correctly after an upgrade because, as the name states, it is an 'unstable' distribution that may have bugs. I use it because Debian Stable ships older versions of software and I want the latest versions where possible. Plus I don't mind tinkering with the system when things break, which helps as well. Over the years I have found some easy ways to prevent the most obvious problems when upgrading, and I will share them here.

The first tip is to upgrade the system regularly. You can decide the frequency, but do it regularly. I upgrade twice a month, which ensures the system has the latest updates and is never so far out of sync that I need to download a ton of files for the upgrade. This matters when you don't have much free space in the root partition: the longer you wait, the more files you have to download and the less free space you have for the actual upgrade.

Another thing that has helped me a lot is to look carefully at the list of packages being upgraded, and specifically at any packages being removed. Don't upgrade if a lot of packages are being removed without updated versions being installed. To give an example, I tried upgrading my system yesterday and apt told me: "457 upgraded, 11 newly installed, 297 to remove and 0 not upgraded." Looking at the packages it was going to remove, I found that if I had blindly allowed the upgrade to proceed it would have uninstalled my entire KDE install, my VPN server and a whole bunch of other stuff. I waited a day and tried again; the bug that was making the system insist on removing KDE during the upgrade had been resolved, and I was able to upgrade successfully. (You can preview all of this without touching the system; see the sketch below.)
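To see what an upgrade will do before committing to anything, you can ask apt for a dry run. A minimal sketch; the grep simply filters the output down to the install/remove actions:

# Simulate the upgrade and list the packages that would be installed/removed
apt-get -s dist-upgrade | grep -E '^(Inst|Remv)'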

I also pipe the output from the apt-get dist-upgrade command to a log file, so that I have a record of what was changed and any errors are captured for me to look at later in case there are issues. The command I use is below:

apt-get dist-upgrade 2>&1 | tee ~suramya/Documents/Suramya/Computer\ Update\ Logs/StarKnight/2022/10032022

I keep all the logs from the upgrades so I can see exactly what was changed on the system and when. That makes it a lot easier to troubleshoot issues caused by an upgrade.

If you have multiple systems, I recommend that you don't upgrade all of them at the same time. I stagger them by a day or two so that in case of issues I still have a working system. This has saved my sanity a few times.

Well, this is all for now. Do share any tips you might have for avoiding issues during an upgrade.

– Suramya

September 25, 2022

How is everyone ok that Windows is showing advertisements everywhere in the system?

Filed under: Computer Software,My Thoughts,Tech Related — Suramya @ 11:55 PM

Linux is an open source operating system that is available for free, while Windows is a paid OS that costs a fair bit of money (~$200 per license). One would think that since we get Linux for free, we would be the product there. Strangely, it is the other way around: it is Windows that shows me advertisements as if I had gotten it for free, and even more strangely, people seem to be OK with it.

My Linux setup has zero ads pushed to it by the OS; Windows, on the other hand, seems determined to put advertisements wherever it can find some space. For example, you get ads in the Start Menu, on the lock screen, in Windows Explorer and so on. If I am paying money for the OS I don't want ads pushed at me that I can't get rid of. The folks over at How-To Geek have a 14-page document explaining how to disable all the built-in advertising in Windows 10, which shows how strongly MS is pushing advertisements on their platform.

This is ridiculous. I would complain about this many ads even on a system I didn't pay for, but apparently it is fine for a billion-dollar company to waste my screen real estate, bandwidth and processor power showing me advertisements on an OS that I paid money for. If a system is going to show me ads, then it should at least be free so there is some excuse for the behavior, similar to what Netflix is doing, where the plan with advertisements in the programming is cheaper than the one without.

What do you think?

– Suramya

September 24, 2022

Keep your disk temperatures below 40 Deg C to increase their life

Filed under: My Thoughts,Tech Related — Suramya @ 12:18 PM

Over the past few decades, since I got my first computer, hard drive failures have been a constant problem for me; until recently my disks would last a maximum of about two years before I started seeing errors and disk failure. I tried all the brands, including Seagate, WD and a couple of others, but had the same issue with each. It had gotten bad enough that I was looking at buying enterprise hard disks instead of the desktop versions, even though the enterprise versions are a lot more expensive.

Then one day, while randomly looking at the temperature sensors for the system, I noticed that two of the disks were at 41 Deg C, and the logs on the disks showed that this was a common occurrence. A quick Google search told me that drives should be kept below 40 Deg C to avoid failures. So I opened up the case and added a couple more fans so that there was a constant flow of air over the disks. It took me about 20 minutes, and I already had the extra fans lying around. With the new fans the disk temperatures dropped to between 33-35 Deg C, and I left it at that.
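If you want to keep an eye on your own drives, smartmontools can read the temperature (and the power-on hours I mention below) straight from a disk's SMART data. A minimal sketch; run it as root, and /dev/sda is a placeholder for your actual device:

# Print SMART attributes and pull out the temperature and power-on hours
smartctl -A /dev/sda | grep -i -E 'temperature|power_on'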

That was about 5 years ago. Today I was running my quarterly SMART scan of all my disks and noticed that they have now been running for an average of 50k hours (one of the disks is at 2k, but all the others have been running constantly for a while). The highest lifetime figure in my system is currently 52381 hours -> 2182.5 days -> 5.97 years. That is a massive improvement over the previous average of <2 years. I am sure the same would be true for laptops, but it is difficult to add another fan to a laptop so I haven't tested it. Plus my laptop doesn't get used as often as my desktop, since I mainly use it while traveling, whereas the desktop has been running pretty much 24×7 since I got it.

This shows that keeping your CPU/devices at the recommended temperature is essential for a long component life. It is one of the reasons that data centers are cooled to the degree they are, and why any increase in the maintained temperature needs to be carefully tested before implementation.

– Suramya

August 31, 2022

Thoughts around Coding with help and why that is not a bad thing

Filed under: Computer Software,My Thoughts,Tech Related — Suramya @ 11:40 PM

It is fairly common for people who have been in the industry a long time to complain about how the youngsters don't know what they are doing, how they couldn't accomplish anything without all the fancy gadgets/IDEs, and how things were better the way the complainer does them, because that is how they learnt to do things! The rant below was posted to Hacker News a little while ago in response to a question about Copilot, and I wanted to share some of my thoughts on it. But first, let's read the rant:

After decades of professional software development, it should be clear that code is a liability. The more you have, the worse things get. A tool that makes it easy to crank out a ton of it, is exactly the opposite of what we need.

If a coworker uses it, I will consider it an admission of incompetence. Simple as that.

I don’t use autoformat, because it gets things wrong constantly. E.g. taking two similar lines and wrapping one but not the other, because of 1 character length difference. Instead I explicitly line my code out by hand to emphasize structure.

I also hate 90% of default linter rules because they are pointless busywork designed to catch noob mistakes.

These tools keep devs stuck in local maxima of mediocrity. It’s like writing prose with a thesaurus on, and accepting every single suggestion blindly.

I coded for 20 years without them, why would I need them now? If you can’t even fathom coding without these crutches, and think this is somehow equivalent to coding in a bare notepad, you are proving my point.

Let’s break this gem down and take it line by line.

After decades of professional software development, it should be clear that code is a liability. The more you have, the worse things get. A tool that makes it easy to crank out a ton of it, is exactly the opposite of what we need.

If a coworker uses it, I will consider it an admission of incompetence. Simple as that.

This is a false premise. There are times when extra code is a liability, but most of the time the boilerplate, error-checking etc. is required. The languages of today are more complex than what we had 20 years ago; I know because I have been coding for over 25 years now. It is easy to write Basic/C/C++ code in a notepad and run it; in fact, even for C++ I was using the Turbo C++ IDE over 25 years ago. We didn't have distributed micro-services 20 years ago, and most applications were a simple client-server model; now we have applications connecting peer-to-peer and more. Why would I spend time retyping code that a decent IDE can auto-populate when I could use that time to solve more interesting problems?

This is the kind of developer who would spend days reformatting code by hand to look just right instead of coding the application to perform as per the specifications.

I don’t use autoformat, because it gets things wrong constantly. E.g. taking two similar lines and wrapping one but not the other, because of 1 character length difference. Instead I explicitly line my code out by hand to emphasize structure.

This is a waste of time that could have been spent working on other projects. I honestly don't care what the structure is, as long as it is consistent and reasonably logical. I personally wouldn't brag about spending time formatting each line just so, but that is just me.

I also hate 90% of default linter rules because they are pointless busywork designed to catch noob mistakes. These tools keep devs stuck in local maxima of mediocrity. It’s like writing prose with a thesaurus on, and accepting every single suggestion blindly.

I am not a huge fan of linters, but it is good practice to use one to catch basic mistakes. Why would I spend manual effort finding basic issues when a system can do it for me automatically?

I coded for 20 years without them, why would I need them now? If you can’t even fathom coding without these crutches, and think this is somehow equivalent to coding in a bare notepad, you are proving my point.

20 years ago we used dial-up modems and didn't have gigabit network connections. We didn't have mobile phone/internet coverage all over the world. Things change, and we need to change with them.

Why stop at coding in notepad/vi/emacs? You should move back to assembly, because it gives you full control over the code and lets you write it more elegantly without any 'fluff' or wasted code. Or even better, start coding directly in binary; that will ensure really elegant and tight code. (/s)

I had to work with someone who felt similarly, and it was a painful experience. They were used to writing commands/code in hex to make changes to the system, which worked for the most part but wasn't scalable, because nobody else could do it as well as they could, and they didn't want to teach others in too much detail, I guess because it gave them job security. I was asked to come in and create a system that allowed users to make the same changes using a WebUI that was translated to hex in the backend. It saved a ton of hours for the users because it was a lot faster and more intuitive, but this person fought it tooth and nail and did their best to get the project cancelled.

I am really tired of all these folks complaining about the new way of doing things just because that is not how they did things. If things didn't change and evolve over the years, and new things never came in, we would still be using punch cards or an abacus for computing. 22 years ago we had a T3 connection at my university; it was considered state of the art and gave us a blazing speed of up to 44.736 Mbps shared with the entire dorm. Right now I have a 400 Mbps dedicated connection just for my personal home use. Things improve over the years, and we need to keep up-skilling ourselves as well. There are so many examples I could give of things that are possible now which weren't possible back then… This sort of gatekeeping doesn't serve any productive purpose; it is just a way for people to control access to the 'elite' group and make themselves feel better, even when they are not as skilled as the newer folks.

The caveat is that not all new things are good; we need to evaluate and decide. There are a bunch of things I don't like about the new systems because I prefer the old ways of doing them. That doesn't mean anyone using the new tools is not a good developer. For example, I still prefer using SVN instead of Git because that is what I am comfortable with; Git has its advantages and SVN has its advantages. It doesn't mean I get to tell people who use Git that they are not 'worthy' of being called good developers.

I dare this person to write a chat-bot without any external library/IDE, or to create a peer-to-peer protocol to share data amongst multiple nodes simultaneously, or any of the new protocols/applications in use today that didn't exist 20 years ago.

Just because you can’t learn new things doesn’t mean that others are inferior. That is your problem, not ours.

– Suramya

August 28, 2022

Debian looking at changing how it handles non-free firmware

Filed under: Computer Software,Linux/Unix Related,Tech Related — Suramya @ 5:38 PM

One of the major problems when installing Debian as a newbie is that if your hardware is not supported by an open ('free') driver/firmware, the installer doesn't install any, and it is then a painful process to download and install the driver manually, especially if it is for the wireless card. With earlier laptops you could always connect via a network cable to fetch the drivers, but newer systems often don't come with an Ethernet port (which I think sucks, BTW), so installing Debian on those systems is a pain.

How this should be addressed has been debated for a while now; it was even one of the questions Jonathan Carter discussed in his post on 'How is Debian doing'. There are a lot of people with really strong opinions on the topic, and 'adulterating' Debian by allowing non-free drivers to be installed by default has many of them up in arms. After a lot of debate on how to resolve this, there are three proposals up for vote in September:

Proposals A and B both start with the same two paragraphs:
We will include non-free firmware packages from the “non-free-firmware” section of the Debian archive on our official media (installer images and live images). The included firmware binaries will normally be enabled by default where the system determines that they are required, but where possible we will include ways for users to disable this at boot (boot menu option, kernel command line etc.).

When the installer/live system is running we will provide information to the user about what firmware has been loaded (both free and non-free), and we will also store that information on the target system such that users will be able to find it later. The target system will also be configured to use the non-free-firmware component by default in the apt sources.list file. Our users should receive security updates and important fixes to firmware binaries just like any other installed software.

But Proposal A adds that “We will publish these images as official Debian media, replacing the current media sets that do not include non-free firmware packages,” while Proposal B says those images “will not replace the current media sets,” but will instead be offered alongside them.

And Proposal C? “The Debian project is permitted to make distribution media (installer images and live images) containing packages from the non-free section of the Debian archive available for download alongside with the free media in a way that the user is informed before downloading which media are the free ones.”

Debian is not the most new-user-friendly system out there, and a lot of distributions became popular because they took the Debian base and made it more user-friendly by allowing non-free drivers and firmware. So this is a good move in my opinion. Personally, I feel option B might be the best one to keep both the purists and the reformers happy. I don't think option C is a good option at all, as it would be confusing.

Source: Slashdot: Debian Considers Changing How It Handles Non-Free Firmware

– Suramya

August 26, 2022

Using MultiNeRF for AI-based image noise reduction

Filed under: Computer Software,Emerging Tech,My Thoughts,Tech Related — Suramya @ 2:58 PM

Proponents of AI constantly make claims that don't hold up under extensive testing; however, the new release from Google Research called MultiNeRF, which runs on RAW image data to generate what photos would have looked like without the noise introduced by imaging sensors, seems to be an exception. Watching the video it almost looks like magic, and it appears to work great. Best of all, the code is open source and already released on GitHub under the Apache License. The repository contains the code release for three CVPR 2022 papers: Mip-NeRF 360, Ref-NeRF, and RawNeRF.
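If you want to try it yourself, grabbing the code is a one-liner (the repository path below is the one the project links to at the time of writing, and the environment setup steps are described in the repository's README):

git clone https://github.com/google-research/multinerf.git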

TechCrunch has a great writeup on the process, and DIYPhotography has created a video demo (embedded below) that showcases it:


Video Credits: DIYPhotography

I like these new tools that make photographs come out better, but I still prefer to take unaltered photos whenever I can. The most alteration/post-processing I do on my photos is cropping and resizing, and even that only infrequently. But this would be of great use to professional photographers shooting in less than optimal conditions.

– Suramya

August 12, 2022

Multiple Linux Live CDs on a single USB Drive

Filed under: Computer Tips,Linux/Unix Related,Tech Related — Suramya @ 6:55 PM

Portable boot disks are a lifesaver for a techie, and I usually carry one with me most of the time (it's part of my keychain 🙂 ). However, the issue I kept running into was that I could only carry one live CD at a time on a USB stick, and if I wanted another one I would either have to hunt for the pen drive where I had installed it or burn another image to the drive, which was annoying, especially when I had to switch between OSs frequently.

So I started searching for an alternative, something similar to the Ultimate Boot CD (which lets you carry multiple diagnostic tools on one CD) but for live distros and installation media. I tried a bunch of approaches, and the easiest one I found was using Ventoy to create a bootable USB.

You can download Ventoy from its GitHub Releases page, and installing the tool is as easy as extracting the archive to a folder and running the correct executable for your system (they ship executables for all supported architectures). Run the file as root, select the USB disk you want to use, and click install. It takes about a minute for the software to install onto the drive, and once it completes it creates two partitions on the disk. The partition named VTOYEFI is reserved for Ventoy's boot files, so make sure you don't change anything there. (If you prefer the terminal, see the sketch below.)
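The archive also includes a command-line installer script if you would rather not use the GUI. A minimal sketch; replace /dev/sdX with your actual USB device, and triple-check it, because the install wipes the drive:

# Install Ventoy onto the USB stick (destroys all existing data on it)
sudo sh Ventoy2Disk.sh -i /dev/sdX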

The other partition, labelled Ventoy, is an exFAT partition, and this is where you copy the ISO files for the distributions you want the disk to support. Installing a new OS/tool/CD is as simple as copying its ISO file onto the partition. Once the files are copied, all you have to do is unmount the partition and your new disk is ready to use, as sketched below.
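In practice that step looks something like this (the mount point and the ISO file names are placeholders for whatever your system uses):

# Copy the images onto the Ventoy data partition, then flush and unmount
cp debian-11.5.0-amd64-netinst.iso kali-linux-2022.3-live-amd64.iso /media/Ventoy/
sync
umount /media/Ventoy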

I installed the Debian installer, the Kali live CD and the Kali installer on an 8GB drive with no issues. When I boot from the disk, I get a menu asking me to select the ISO I want to boot into, and the system then boots into the boot menu for that image. So now I can carry one pen drive with all the OSs I need to troubleshoot a system or reinstall an OS. I think you should be able to boot into the Windows installer as well using this method, but I haven't tried it yet so I can't confirm.

Well, this is all for now. Will post more later.

– Suramya
