Author Archives: shyam

About shyam

Network professional, entrepreneur, technology enthusiast, still not a geek

Wi-fi enable your camera with Eye-Fi

I always wished to purchase a wireless transmitter for my Nikon. The freedom and flexibility offered by wireless capability is great, but the price tag of approximately $750 kept me away from the professional solutions for Wi-Fi in my camera. I could not find it justifiable that my sub-$100 phone comes equipped with Wi-Fi capability while my $600 DSLR does not offer any kind of Wi-Fi connectivity.

 

So I was really happy to see a creative concept from a company turn into a nice product that can add Wi-Fi capability to any digital camera, whether it is a point-and-shoot or a DSLR. The product is creatively named Eye-Fi. You can find them here.


The concept behind the Eye-Fi is quite simple: it adds a wireless transmitter to a regular SD memory card. The modified memory card looks and functions just like a regular SD card, so when you push it into the memory slot of your camera, the camera detects it as a regular memory card. But the wireless transmitter is powered by your camera and transmits the data through the configured Wi-Fi network to a configured folder on your system. That means the pictures will literally fly to your computer or media server right after you click them.
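On the receiving side, that "configured folder" idea can be pictured as a small watcher process. Here is a minimal, purely hypothetical Python sketch of the receiving end; the folder name, polling approach, and file extensions are my assumptions for illustration and have nothing to do with the actual Eye-Fi software:

```python
import os

def watch_incoming(folder, seen, handle):
    """Poll `folder` once; call handle(name) for each image file not yet seen."""
    for name in sorted(os.listdir(folder)):
        # Assumed extensions: JPEGs plus Nikon raw files.
        if name.lower().endswith((".jpg", ".nef")) and name not in seen:
            seen.add(name)
            handle(name)

# Hypothetical usage: call watch_incoming() every few seconds against the
# folder the card uploads to, e.g. watch_incoming("eyefi_uploads", seen, show)
```

Calling the function repeatedly with the same `seen` set means each uploaded photo is handled exactly once.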

 

All these features, available for approximately $45, make it a worthwhile buy for anyone who needs the flexibility of Wi-Fi for their camera but does not wish to spend big money on it.

 

To find more details about the operation of the Eye-Fi, click on the image below.


Turn your laptop into a Wi-Fi hotspot

 

 

There are numerous articles on the internet discussing this topic, but curiously most of them focus on third-party applications for setting up your own Wi-Fi hotspot. This post is an attempt to discuss the built-in feature of Windows 7 that converts your wired internet or 3G connection into a shared access point for your friends or business colleagues.

 

Background:

Situation #1: There are numerous occasions when you need to throw up your own Wi-Fi hotspot. Maybe your ADSL or cable internet connection does not come with a Wi-Fi-enabled router but uses a modem that attaches directly to your desktop, while your tablet or smartphone supports only Wi-Fi access rather than a typical Ethernet connection.

 

Internet Connection Sharing (ICS) has been present in Microsoft products from Windows 98 Second Edition onward. Mostly we use ICS to share an internet connection over a wired network, but the same feature can be used to share your wired connection over a wireless network, converting your laptop or desktop into a wireless access point.

Situation #2:

You have a 3G connection available as a USB dongle which you wish to share with your friends. High-end routers have an option to plug the 3G dongle directly into them, but that is normally a costly solution, these routers do not support every dongle, and in any case our discussion here is about avoiding such routers altogether and using our laptop for the same function.

Current Scenario 

To experiment with setting up a Wi-Fi hotspot using the built-in feature of Windows, we have done a basic setup with two Wi-Fi-enabled laptops and an Idea Net Setter 3G dongle, so we are demonstrating Situation #2. Instead of this 3G dongle, you can also use a wired broadband connection as discussed in Situation #1.

A Sony VAIO laptop running Windows 7 Ultimate is used to create the Wi-Fi hotspot. An Idea Net Setter 3G dongle is installed on the machine and shows up as a connection under Control Panel\Network and Internet\Network Connections; refer to the image above for the configuration details. The wired connection in the exhibit is disconnected. If a wired broadband connection is used, as discussed in Situation #1, the wired connection will be enabled instead for this procedure.

Now open the properties of the 3G connection (or the LAN connection if you are using wired broadband).

Select the Sharing tab and enable Internet Connection Sharing by checking the box for the "Allow other network users to connect..." option. This enables the DHCP and NAT functions on the selected connection, and the system will be ready to share the internet over the connection we are going to specify.

Now select the wireless connection as the preferred home networking connection. This changes the IP address of the wireless adapter to 192.168.137.1, which is the range used for ICS in Windows 7 (for ICS on a Windows XP machine, the IP was 192.168.0.1).
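The 192.168.137.1 address is not arbitrary: ICS on Windows 7 puts the shared adapter at the first usable address of the 192.168.137.0/24 range and leases client addresses from the same subnet. A quick illustration with Python's standard ipaddress module (the module is only used here to show the addressing; it has nothing to do with ICS itself):

```python
import ipaddress

# Windows 7 ICS assigns the shared adapter 192.168.137.1 and leases
# client addresses from the same /24 (XP's ICS used 192.168.0.1 instead).
ics_net = ipaddress.ip_network("192.168.137.0/24")

gateway = next(ics_net.hosts())                 # first usable address: .1
client = ipaddress.ip_address("192.168.137.2")  # first DHCP lease we saw

print(gateway)            # 192.168.137.1
print(client in ics_net)  # True: the client lease is on the ICS subnet
```

This matches what we will see later on the client machine, whose DHCP-assigned address comes out of the same /24.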

Now we have to convert our wireless connection to ad-hoc mode so that it is ready to receive users over the network.

Go to Network and Sharing Center and select the option to set up a new connection or network.

Now select the option to set up an ad-hoc network.

Choose the SSID, the preferred encryption method, and the encryption key for the new wireless connection, then finish the procedure.

Now your connection will be visible in the system tray area.

On the client side we are using an XP machine, though it does not matter which OS the client runs. Select the wireless network named Desktopreality, which now shows in the wireless configuration window as shown below, and type in the security key you set. The Wi-Fi adapter will connect to the network on your Windows 7 machine, the adapter's IP address will be supplied by the DHCP service of ICS on Windows 7, and you will get 192.168.137.2 as the IP.

And that's it. You finally have a Wi-Fi hotspot set up without using any costly router hardware or any kind of third-party software. The final result is shown below.

Protect your Blog content with DMCA


The Digital Millennium Copyright Act (DMCA) has been controversial since the earliest days of the act itself. This blog post is not an effort to weigh the merits and demerits of the DMCA, but rather to consider the DMCA from a normal blogger's point of view.

 

What is the DMCA?

Just to drive the point home: The Digital Millennium Copyright Act (DMCA) is a United States copyright law that implements two 1996 treaties of the World Intellectual Property Organization (WIPO). It criminalizes production and dissemination of technology, devices, or services intended to circumvent measures (commonly known as digital rights management or DRM) that control access to copyrighted works. (Source: Wikipedia; read the full details here.)

 

Why should you care?

As a technology writer, I know how hard it is to convert a basic idea in your mind into words and write an article. English is not my primary language, which makes this process even harder. So it is really painful to see your hard work simply lifted by another blogger and pasted on their site without even crediting the original author. Newer technologies make it really easy for anyone to infringe copyrighted work and claim it as their own. This kind of plagiarism is often ignored by the original author because he does not know any effective method of preventing it; maybe the content thief resides in another country, protected by geography and by the laws applicable there.

The internet is not an anonymous world. It is really easy to check whether anyone has copied your content to their website: a Google search with the suspected content will reveal who lifted it. But what action are you going to take against the person who did?

Here comes the significance of the DMCA. You can file a DMCA complaint against the person who copied your content. If he is blogging on a free platform like blogger.com or WordPress, you have the option to file a report and the platform will take care of it. If you are accusing a company or person who runs a website with your content, you can file a DMCA notice with their hosting provider. The hosting provider will take it seriously and will remove the content in question if the allegation appears correct prima facie.

Another form of protection involves letting would-be content lifters know that you take content copyright seriously, by displaying badges against copyright violation on your site and tracking your content regularly for any kind of reproduction on the internet.

The primary site for this is dmca.com itself, which makes its DMCA Protection & Takedown services easy to use.

Such sites even make it easy to file a DMCA complaint on your behalf. (You can always file a DMCA notice on your own, but if you are subscribed to any of these services, use their facility.)

 

Download the badges for your site here and place them on your website. Create a basic account on the DMCA site and use the WordPress or Blogger plugins to display badges and certificates for your posts.

 

 

 

dmca.com may not be the ultimate way to protect your content on the internet, but at least you are telling the world that the content is your hard work and that you will mind if someone just does a copy and paste.

 

Data center virtualization – Basic aspects

 

 

Virtualization is the new buzzword in the computing industry. Even though the average customer or support engineer has been exposed to this term only recently, for the past few years it has been one of the favorite words of corporate IT divisions.

Even if you are not very familiar with the typical concepts of data center virtualization, I am sure you have come across the idea of virtualization through software like VirtualBox, VMware Workstation, and Microsoft Virtual PC. Yes, the same software you use to run multiple operating systems on your system is also the basic concept behind corporate-level server virtualization.


Let us try to discuss some basic aspects of data center virtualization and how it differs from the typical toys you use for running virtual machines.

Any large organization that uses a large number of computers faces many issues with those systems. In my personal experience, most company management treats the IT division as an evil that eats up a big chunk of the company's revenue. The computers, as well as the software installed on them, demand a constant upgrade process, and the life span of a typical technology is relatively short, forcing companies to purchase new systems and peripherals almost yearly. In addition, the power consumption of the systems, as well as the problems created by e-waste and the associated environmental issues, are also big concerns.

To deal with such issues, most companies now care about power management: implementing green PC standards, using LCDs instead of CRTs, and applying CPU throttling and similar power management techniques to cut down the power consumption of their systems.

 

But all these options for power reduction and energy saving are not feasible for your company's datacenters. A datacenter is the place where you keep all your servers in a secure, temperature-controlled room. (Nowadays it is more fashionable to say "data center" than to call it a server room.)

The servers in the datacenter manage the data flow, and companies expect these servers to have an availability level of almost 99.999%. An average server with RAID controllers and multiple hard disks consumes more than 1 kW of electricity, and typical data centers have hundreds of servers operating 24 hours a day, 7 days a week. We cannot implement power management on these servers, as customers need their full performance all the time.


The secondary issue is that all these machines convert a portion of the power they consume into thermal energy, naturally heating up the data center. So we have to use high-power air conditioners to maintain the temperature of the data center.

So the servers consume power, and then additional power is consumed to cope with temperature management. Even though we do not really think about it, corporate data centers are among the main culprits of global warming 🙂
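To put rough numbers on this, here is a back-of-the-envelope calculation in Python. The server count and the cooling overhead factor are illustrative assumptions on my part (only the ~1 kW per-server draw comes from the discussion above), so treat the result as an order-of-magnitude sketch, not a measurement:

```python
# Back-of-the-envelope datacenter energy estimate (inputs are assumed).
servers = 100            # assumed number of servers in the room
draw_kw = 1.0            # ~1 kW per server, as noted above
cooling_overhead = 0.8   # assumed: 0.8 kW of cooling per kW of IT load
hours_per_year = 24 * 365

it_energy = servers * draw_kw * hours_per_year     # kWh per year, IT only
total_energy = it_energy * (1 + cooling_overhead)  # kWh per year, with cooling

print(f"IT load alone: {it_energy:,.0f} kWh/year")   # 876,000
print(f"With cooling:  {total_energy:,.0f} kWh/year")  # 1,576,800
```

Even with these modest assumptions, a hundred-server room burns through well over a gigawatt-hour a year, which is why consolidation matters so much.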

The next issue is that cable management, server installation, and disaster management systems all add to the total implementation and running cost of the data center. So we cannot blame your finance manager if he thinks the IT division is the biggest financial burden of the company.

The next big issue is that all these servers, on which the company spends so much money and effort, do not utilize even 10 percent of their capacity. It is just like a highly paid staff member who sits idle for 7 of his 8 working hours. No management would tolerate such an employee, yet the same company maintains these lazy servers in the comfort of an air-conditioned room.

 

Data center virtualization is the answer to almost all the issues mentioned above.

We can also call it server virtualization: we divide a physical server into many virtual servers and run a server operating system on each of them.

The basic idea behind such virtual servers is not really new. IBM mainframes implemented a basic version of this idea in the early 60s. Microsoft used the concept in their 1993 release of Windows NT to run virtual DOS machines, and they currently use virtualization to provide a compatibility mode for XP applications in Windows 7. But when this idea is developed further to meet the needs of datacenter management rather than individual desktops, we call it data center virtualization.

 

Energy saving is not the only area where data center virtualization shows its magic. By consolidating 100 or more physical servers into 6 or 7 physical machines, it saves the effort and cost of cabling and the investment in switching and routing equipment. Nowadays even hardware concepts like switching and routing are virtualized, making the life of an IT admin easier.
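The consolidation arithmetic behind that "100 servers into 6 or 7 machines" claim can be sketched in a few lines. The 10% utilization figure is the one used above; the target host utilization and the per-host capacity multiplier are my assumptions for illustration:

```python
import math

# Assumed inputs based on the discussion above.
physical_servers = 100
avg_utilization = 0.10   # each old server is busy ~10% of the time
host_target = 0.70       # assumed safe utilization ceiling for a host
headroom = 2.5           # assumed: a modern host has ~2.5x the capacity
                         # of one of the old servers

# Total useful work, expressed in "old server" units.
useful_load = physical_servers * avg_utilization   # ~10 servers' worth

# Hosts needed so that each stays under the target utilization.
hosts = math.ceil(useful_load / (headroom * host_target))
print(hosts)  # 6
```

Change the assumptions and the answer moves, but with any reasonable numbers the hundred mostly-idle boxes collapse into a handful of well-utilized hosts.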

 

 

Now let us check how the concept of data center virtualization is implemented. In a typical physical machine, the operating system is installed on top of your hardware, and the application software is installed on top of the operating system.

 

 

This basic model changes in the virtualization model. Rather than an operating system sitting directly on top of the hardware, a layer comes between your real hardware and the operating system.

That means a virtualization layer is installed on top of the hardware. We call this layer the hypervisor. On top of this hypervisor layer you can create as many virtual computers as you need. Each of these virtual machines has its own virtual processor, virtual memory, virtual hard disk, and virtual networking. The operating systems running on these virtual machines do not have a clue that they are running in a virtual environment. That means you are virtualizing the hardware, not the software.

Image Copyright (c)  vmware.com

 

In the real world, each of these virtual machines, along with its operating system and application software, is a mere collection of files. That means creating your next virtual machine from these files takes only a few minutes.
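Because a VM is "a mere collection of files," cloning one really is just copying those files. Here is a toy Python illustration of that idea; the file names and directory layout are invented for the example (real hypervisors use their own formats, such as VMware's .vmx configuration and .vmdk disk files):

```python
import shutil
from pathlib import Path

def clone_vm(vm_dir: Path, clone_name: str) -> Path:
    """Clone a VM by copying its directory of files (toy illustration)."""
    clone_dir = vm_dir.with_name(clone_name)
    shutil.copytree(vm_dir, clone_dir)  # config + virtual disks come along
    return clone_dir

# Hypothetical usage: clone_vm(Path("vms/finance-web"), "finance-web-2")
```

A real clone also needs new MAC addresses and machine identities, but the core operation is exactly this kind of file copy, which is why it takes minutes instead of days.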

If your company's finance division needs a server for a new project, the conventional physical-machine approach takes a few days to order a new machine, install the operating system, cable it, and test it before it is handed over. Now all you need is a few mouse clicks to get the new server up and running, and that too in just a few minutes. If you have remote access to your data center on your smartphone, you can do this from virtually any part of the world.

 

There are two kinds of virtualization concepts for data centers: host-OS-based virtualization and bare-metal virtualization.

In host-OS-based virtualization, a typical operating system is installed on top of your hardware, the virtualization layer is installed on top of that, and the virtual machines are created on top of the layer.

Bare-metal virtualization is more efficient, because there the virtualization layer is installed directly on top of the hardware; the virtualization layer itself acts as the operating system, and the virtual machines are installed on top of it.

The operating systems installed in these virtual machines are called guest operating systems.

 

VMware is the world leader as far as the bare-metal virtualization industry is concerned. Basic products like the free edition of ESXi can be implemented without any license cost and can be used to explore the feasibility of virtualization in your company.

Naturally, the enterprise-class products come with a (rather high) price tag, but they really make sense if your environment is mission critical.

Skilled manpower in the virtualization segment is scarce compared with the demand of this fast-expanding market. In addition to the normal infrastructure admin, this concept creates a new post, the virtualization admin, which blends basic IT infrastructure skills, storage skills, and virtualization skills.

 

This article is an attempt to cover the very basic concepts of datacenter virtualization. More topics can be discussed based on requests from readers like you; please use the comment box for a more detailed discussion.

 

Find more details about the virtualization training session here.


Why is Wi-Fi called that?

 

 

Maybe you are accessing this page from your Wi-Fi-enabled device. But why is your wireless connection known by a fancy name like Wi-Fi?

Background story

Back in February 1980, an IEEE committee was formed to define networking standards. The committee named itself the 802 committee, reminding us of the year and month of its formation.

The 802 committee defined the standard for wired Ethernet as 802.3, alongside related specifications such as 802.2 for logical link control.

In the late 80s and early 90s, standards for wireless Ethernet evolved, known as the 802.11 standards. So, strictly speaking, your typical wireless network should be called an 802.11 network.

 

But why "Wi-Fi"?

The term Wi-Fi is not a technical term. It is expanded as "wireless fidelity," just as we use the term hi-fi to denote high-fidelity music systems.

Actually, Wi-Fi is a certification process for wireless interoperability conducted by a consortium named the Wi-Fi Alliance for 802.11-compatible devices.

That means not all 802.11 wireless equipment is Wi-Fi certified, and non-certified equipment cannot officially be called a Wi-Fi device. This kind of certification process came into the field because the IEEE does not test any of these wireless products to verify that they meet IEEE specifications. I personally think the IEEE has a really lousy approach toward its wireless standards in general and did not even care enough about their security aspects.

Even though the Wi-Fi Alliance claims to be a non-profit consortium, it charges a hefty amount for the certification process, so not every organization applies for Wi-Fi certified status for its products.

But by now the term Wi-Fi has become an alias for wireless networking in general, so do not worry about calling your non-Wi-Fi-certified networking product a Wi-Fi device.

 

 

Cisco or Microsoft certification ?

One of the most common questions asked by a typical IT infrastructure student is: which certification is better for me, Cisco or Microsoft? And should I go for CompTIA?

There is no point in directly comparing the Cisco and Microsoft certifications .

First of all, achieving a certification is the second step in your career plan. Before venturing out to get certified, make sure that you get qualified.

Microsoft offers a certification path in networking, and Cisco offers one in internetworking. What is the big difference?

If networking is the interconnection of two or more computers, then internetworking is the interconnection of two or more networks. It follows that before you move to an internetworking career path, you should be confident about networking.

Before you move on with your networking training, make sure you have enough background in basic IT infrastructure, including hardware skills and normal network implementation.

 

Should I go for a CompTIA certification for my basic hardware skills?

 

For developing your basic hardware skills and gaining a deep insight into computing, the A+ certification material from CompTIA is the best. You can follow the CompTIA study program, but actually writing the CompTIA A+ certification exam is not a good investment for your IT career. CompTIA does not offer any vendor-specific certifications, so investing your hard-earned money in a costly exam like A+ is not recommended. I repeat: the CompTIA study path is the best for developing IT fundamentals, but investing in a certification from CompTIA is not a wise move.

 

Microsoft certifications will be your first wise investment towards a better career in networking. Even if you are planning to move towards Linux networking, basic skills with Microsoft products will always be an added advantage on your resume. The MCITP Enterprise Administrator track is the one you should go for.

If you wish to move towards the Linux segment, pick up at least Active Directory Services knowledge and basic infrastructure skills from the Microsoft ADS and infrastructure papers, then move on to RHCE.

 

Cisco certification is not a basic certification in networking. Even the CCNA program from Cisco expects strong networking skills before you qualify yourself to attempt a CCNA exam.

So, back to the question of whether a Cisco or Microsoft certification is better: you should go for both if you plan for a better career in networking.


Your Neighbours’ Wi-Fi – Stay away from temptations

 

 

As a techno-savvy person, you find more pleasure in hacking into your neighbours' Wi-Fi broadband than in eavesdropping on them. Maybe the first thing you do when you move to a new locality is to check how many open Wi-Fi connections are available, so that you can enjoy an online life without monthly bills from your ISP.

Some people argue that accessing an open Wi-Fi internet connection is not really against any rules and regulations and is quite different from breaking into someone's network. But your neighbour's technical illiteracy is not a valid reason for you to access their internet connection without permission. Most of the time this kind of access happens in metro cities, where the Wi-Fi density is high and you are surrounded by the temptations of so many open Wi-Fi access points.

If users forget to protect their Wi-Fi access point with a password, that does not give you the right to break into their network. If you find a car unlocked with the keys in place in someone's driveway, that does not give you a valid reason to drive away with it.

Accessing someone's Wi-Fi network is not just about stealing the bandwidth of their internet connection, but also about accessing their personal data. If any data loss or theft happens on your neighbour's systems and he or she goes for any kind of cyber forensics, you are going to be the culprit: your MAC ID is in the router and will act as clear evidence against you.

So please stay away from unauthorized use of any open Wi-Fi you find in your vicinity.

 

Why you still Control-Alt-Delete

 

 

I hope some of you have already wondered why we press this strange combination of Control-Alt-Delete whenever we have to log on to a networked system, or to end a task when some program hangs.

The basic story behind the Control-Alt-Delete key combination dates back to the original IBM PC in 1981.

David Bradley is the man behind the use of this key combination in the original IBM PC. He wanted to create a soft-reboot option for the PC that did not require pressing the reset button or power button of the computer. At that time the 84-key keyboard design was in use, which did not have the extra Control and Alt keys we have on our 101-key keyboards now.

 

So the basic selection criterion for these three keys was that a user could not accidentally produce the combination with one hand.

He has to use both hands deliberately to activate the soft-reset option, either in the BIOS setup or at the DOS prompt.

When Windows came to the market, the initial purpose of the three-key combination changed to invoking the end-task process, while two Control-Alt-Delete combinations in a row would still restart the machine. That was the case from Windows 1.0 through Windows 95, 98, and up to Windows ME.

In the Windows NT family of operating systems, from Windows NT 3.1 to the latest OS you are running today, this three-key combination is part of a security process. When you log on to the system by pressing Control-Alt-Delete, you get an assurance that any previous TSR software on the machine has been killed before your session starts. The combination is hard to emulate over a network, so the user is expected to be physically in front of the machine to press Control-Alt-Delete. (Yes, you can use Remote Desktop on newer operating systems, but even then Winlogon instructs the GINA over the network.)

So after all these years, the three-finger salute is still there on your machine, and it still does the job of a soft reboot when your system is working in real mode without your OS.

 

How to verify your Windows Authenticity

This post is a response to a question asked by a user on the support site AskShyam.com. This is the question:

      “hello sir could you please explain how to check if my wndows 7 is  genuine. I  have hp pavilion g4 1200 lap comes with windows 7 home premium,is this an original version?”

It is a common question from a typical user who purchases a desktop or laptop along with a legal operating system, or who purchases a retail pack of a Microsoft product to enjoy the benefits of being a legal user.

 

1. How can you verify that the product obtained from your hardware vendor is genuine and that you were not cheated into purchasing a pirated product at a high cost?

Before you purchase an original version of Windows, visit the Microsoft website for a detailed checklist of what to look for in a genuine Windows pack. Click on the image below.

2. I became a victim of software counterfeiting. I purchased the product in good faith that it was legitimate software. What should I do?

First of all, it is partly your fault for purchasing counterfeit software; you should observe the steps mentioned above before you purchase. Still, if you have the purchase details and the software CDs with you, you can contact Microsoft describing the situation, and Microsoft may offer a complimentary kit for your Windows or Office family of products. Please follow the steps below.

Visit the link below: http://www.microsoft.com/en-us/howtotell/default.aspx

 

Click the option shown in the image and follow the steps that appear. Once Microsoft decides that you are eligible for a complimentary kit, it will be shipped to you.