
How Open Source & Linux Won the Web

On the desktop, for home and office users, Microsoft Windows still has the highest market share. Apple is second, followed by Linux, then BSD, Solaris, and a scattering of lesser Unix-like and other niche systems. So it may be surprising that as soon as you take computing off the single-user desktop and onto the World Wide Web, the picture inverts: Free and Open Source Software is either in the top position or rapidly rising.

Apache, the open source web server package, has the biggest market share, and not by a slim margin but by a vast one. Linux, the most popular Unix-like system, holds a sizable chunk of the server market as well as of communication devices such as routers, smartphones, and GPS units. Even on the desktop, the Mozilla Firefox web browser has risen in just a few short years to capture between a fifth and a quarter of the user market, and it is still growing. Other open source web browsers have chiseled away at the market too.

Linux & Open Source


To see why the Internet and open source are such a good fit for each other, we first have to look at the origins of the Internet as we know it today. Before the mainstream explosion of the web in the 1990s, the Internet was still very much the domain of universities, colleges, laboratories, and governments. It was still largely a research network, and that gives us our first clue, because the scientific community has long valued open cooperation among its members. In the beginning, all software was open source, simply because nobody had yet thought of charging money for it.

Sir Tim Berners-Lee, widely recognized as the father of the World Wide Web (before the web, there was only the underlying Internet), created the first web server, speaking his HTTP protocol, while working at CERN. His system was NeXTSTEP, a Unix derivative built on BSD code. The web browser that first popularized the World Wide Web was NCSA Mosaic, built on a Unix system and released as open source.

So open source and Unix were there at the very beginning. In fact, both of the major web browsers in use today, Internet Explorer and Mozilla Firefox, can trace their roots directly back to the Mosaic web browser. As for the World Wide Web itself, one need only consider that to this day, directory paths in website URLs follow the forward-slash (/) convention of Unix file systems, rather than the back-slash (\) of DOS systems.
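The slash convention is easy to see from code. This small Python sketch (the URL is hypothetical) shows that URL paths split exactly like POSIX paths, while DOS-style paths use the backslash:

```python
from pathlib import PurePosixPath, PureWindowsPath
from urllib.parse import urlsplit

# URL paths follow the POSIX forward-slash convention of Unix file systems.
url = "https://example.com/docs/guide/intro.html"
path = urlsplit(url).path            # '/docs/guide/intro.html'

posix = PurePosixPath(path)          # splits on '/', just like Unix
windows = PureWindowsPath(r"C:\docs\guide\intro.html")  # DOS-style backslashes

print(posix.parts)    # ('/', 'docs', 'guide', 'intro.html')
print(windows.parts)  # ('C:\\', 'docs', 'guide', 'intro.html')
```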

In addition, even Apple as we know it today owes some of its existence to Free and Open Source software, since the OS X operating system has BSD, another open source Unix-based system, running under the hood. So we have Linux, BSD, and OS X, and after that Solaris, which is also a Unix-like system and available in open source form as OpenSolaris. The remaining stragglers bringing up the rear are today either open source, Unix-related, or both. So Microsoft is in fact the exception rather than the rule, and even Windows began its life running on top of DOS, which itself started out as a work-alike of CP/M.

The Internet and the World Wide Web are, by their nature, only useful if they are as free of constraint as possible and allow every system from everywhere to cooperate under a unified standard. So the path of least resistance has simply been open source software, usually free of charge, running on the default system of computing history: Unix and its derivatives, of which Linux is today the most popular.

Furthermore, the nature of the web itself is open. If a visitor requests a web page from your server, both computers have to share a common communication standard, so there is little alternative to open protocols. You can still view the source code of any web page you visit, and the secondary languages supporting the web, such as CSS and JavaScript, are likewise open in nature. The scripting languages on the back end of the server are typically open source as well, including Perl and PHP.
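That openness is visible at the protocol level. As a minimal illustration (the host and resource names are made up), an HTTP/1.1 request is nothing but plain, human-readable text that any conforming client or server can produce and parse:

```python
# A raw HTTP/1.1 request is plain text: a request line, header lines,
# and a blank line ending the header section.
host = "example.com"          # hypothetical host, for illustration only
resource = "/index.html"

request = (
    f"GET {resource} HTTP/1.1\r\n"
    f"Host: {host}\r\n"
    "Connection: close\r\n"
    "\r\n"                    # blank line terminates the headers
)

# Because the format is an open standard, any client and any server,
# on any operating system, can exchange exactly this text.
print(request)
```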

No one could even imagine it being otherwise. This is the truth that hides behind the current desktop dominance of a closed-source proprietary system: it is the exception, not the rule.

Australian Web Hosting


Australia has a broad base of web hosting service providers operating on different platforms. This has expanded the options available locally to Australian webmasters. The web space available depends on the plan the client opts for.

Modern web hosting services often provide free website space along with their regular services, usually as an add-on to a service the client is already using. Customers are free to use the space online and to upload as many features as required. This has given a shot in the arm to hosts that may not be the cheapest available, since customers are guaranteed solid service along with the free space.

The latest offers from web hosts combine options that the client has to sort out carefully. The quantity of data transfer should not be compromised while enhancing other features. Often a plan includes limited email accounts and limited MySQL databases along with a high data transfer allowance and smaller storage space. The client should have a clear understanding of the target audience and an estimate of the data transfer required. Clients who deal individually with each customer will find such servers difficult to manage once the number of email accounts grows beyond certain limits. Small retail sites may find the rate of data transfer overwhelming within a few months of launch, so such clients must be ready to change their service plans as the situation demands. Sites whose web space requirements are fixed, on the other hand, can take advantage of very high data transfer allowances.

Providing an SSL certificate to the client is now accepted standard practice for a host. Some hosts offer a certificate for each customer on a sub-domain as per their need, though it is advisable to have one for the main domain. Web hosts that offer such certificates have the best chances of attracting customers. Retail e-commerce sites in particular require an SSL certificate, which is necessary for user confidence. Major changes to a website over time, such as a move to a new domain, require the certificate to be reissued. The alternative is to link up with certificate providers who reissue them free of charge for every change that occurs.
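As a rough sketch of keeping on top of certificate reissue, the helper below computes how many days remain on a certificate. It assumes the `notAfter` date string format that Python's standard `ssl` module returns from `getpeercert()`; in real use the string would come from a live TLS handshake rather than a literal.

```python
import ssl
from datetime import datetime, timezone

def days_until_expiry(not_after: str) -> float:
    """Days left on a certificate, given the 'notAfter' string that
    ssl.SSLSocket.getpeercert() returns, e.g. 'Jun  1 12:00:00 2030 GMT'."""
    expires = ssl.cert_time_to_seconds(not_after)   # epoch seconds, GMT
    now = datetime.now(timezone.utc).timestamp()
    return (expires - now) / 86400

# In practice the string would come from a live connection:
#   cert = tls_socket.getpeercert(); not_after = cert["notAfter"]
print(round(days_until_expiry("Jun  1 12:00:00 2030 GMT")))
```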

Internationally, hosting rates have become more attractive than ever, averaging between $5 and $10 per month. However, the service may not be the best, since many webmasters seeking cheaper plans care little about user friendliness. Considering that the many components of a single web page are loaded through multiple HTTP requests, a distant or overloaded server can make every page slow to display. Often the support is no better, and it can cost the customer more in international calls just to get their website back on the net. Having a local host avoids the time zone differences that may affect support during peak hours, and free service calls and prompt attention are better assured from local providers.

With much of the western world saturated and its networks under pressure from a dense population with Internet access, Australia becomes an excellent choice for webmasters looking for fast connectivity and maximum server uptime. Even providers using virtual servers have to deal with breaks in international sea-link cables and with maintenance and repairs at many points in their networks. A local service provider who charges more is still a better deal than an international one who cannot be reached easily when problems arise. Besides, using local services may also improve a site's page rank in organic searches.


Server Location

The Internet is considered the fastest medium of communication available for general use all over the world. Hence, clients seeking web hosting services were initially well served by host servers located anywhere on the planet. However, the Internet revolution has chugged along in a lopsided manner: while computer density, Internet connectivity, bandwidth, and information transfer have grown rapidly in the Western world, the Eastern parts are still playing catch-up. This factor plays an important role in response time, efficiency, and security across the Internet.

The rapid advancement of Internet use in America and Europe has put enormous pressure on their local networks. Websites located on these two continents are accessed vigorously by local as well as international clients, which reduces the speed of data transfer and often leads to server crashes. There have been attempts to reduce such overloads by introducing cloud-hosting techniques, matched by a surge in computer and Internet density across Asia. Yet the majority of servers involved in virtual networks are still located in the Western hemisphere, defeating their purpose.

With the growth of the web hosting business, there has also been a rampant increase in unscrupulous hosts who may indulge in plain cheating and theft of information. The absence of stronger laws on Internet use and applications has held back the development of the hosting industry in Asia, and archaic laws prevent the expansion of bandwidth in most parts of the region, so webmasters find it more suitable to base their sites on servers in America or Europe. But the number of routers through which traffic from the fast-developing parts of Asia and Oceania must pass is enormous, which greatly reduces the speed of data transfer and impedes online transactions. Most of the southern countries, like Australia, are linked through cables that run through Asia, which negates the better bandwidth and infrastructure available in those countries.

It has been found that locally sourced web hosting services can prove far more efficient than those from overseas. Differences in time zones can be an impediment to good customer service. Webmasters all across the world are bound to rush to the hosts offering the cheapest and best deals, which creates a recipe for a logjam in the networks linking to those hosts.

Local hosts can provide better maintenance services at a quicker pace than those overseas. All across the world, web hosting charges run roughly between $5 and $10 per month. A local host that charges more may still prove better than one offering far cheaper service without an assurance of consistent quality. Shifting from an international to a local host can be beneficial for those who experience a round-the-clock logjam: many people expect a lull in night-time Internet usage, forgetting that an equal number of netizens are waking up on the other side of the world. Often the only way to minimize the chances of a slowdown is to localize the server.

Server location can also be used to advantage if the server is closer to the target audience rather than to the webmaster. This is why most search engines have a number of servers spread across the world: servers based in Asia pick up website links that cater mainly to that continent, and similarly for the others. For dedicated servers, localization can make it easier for webmasters to incorporate improvements in how the website functions.


Advancements In Web Hosting

There has been huge advancement in the web hosting business, in terms of both technology and the business processes involved. The launch of cloud computing services in the late 2000s and the rapid growth of virtualization have been the notable features. The spurt in such services has been driven by an insatiable demand for space.

Cloud computing involves the use of a huge number of servers across many locations. This removes the dependence on the memory, storage, and performance of any single computer, and the spread across servers guarantees almost unlimited space for data storage and transfer. A host's machines are normally used within an average range of capacity each day, which leaves ample free space for users to exchange and store data. This hosting model lets clients purchase computing power as needed, effectively from an inexhaustible source. Upgrading or scaling down can be done without any server shutdown, and hardware maintenance and replacements are unlikely to affect overall speed and performance.

The required load balancing is done at the software level and is spread across multiple servers. Small and medium businesses that cannot manage surges in traffic on their own can use cloud hosting effectively. The system ensures an even spread of traffic load and prevents breakdowns due to excess crowding on a single source. The failure of any one server is also compensated for by the others, ensuring an uninterrupted flow of information.
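The idea of software-level balancing with failover can be sketched in a few lines. This toy round-robin balancer (the server names are invented) spreads requests evenly and routes around servers marked as failed:

```python
import itertools

class RoundRobinBalancer:
    """Toy software-level load balancer: spreads requests evenly
    across servers and routes around any marked as failed."""

    def __init__(self, servers):
        self.healthy = dict.fromkeys(servers, True)
        self._cycle = itertools.cycle(servers)

    def mark_down(self, server):
        self.healthy[server] = False

    def pick(self):
        # Try each server at most once per request.
        for _ in range(len(self.healthy)):
            server = next(self._cycle)
            if self.healthy[server]:
                return server
        raise RuntimeError("no healthy servers left")

lb = RoundRobinBalancer(["node-a", "node-b", "node-c"])  # hypothetical nodes
lb.mark_down("node-b")
# Requests now alternate between the surviving servers:
print([lb.pick() for _ in range(4)])  # ['node-a', 'node-c', 'node-a', 'node-c']
```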

Virtualization technology consists of running multiple small virtual servers within a single physical machine. The software that segregates and supports them while reducing losses due to shared hardware has evolved over time; the latest in the line is the Windows Server 2008 system. Virtualization is essential for hosters because it makes the service economical by increasing server utilization: most servers are found to run at far less than their estimated capacity, and virtualization puts that idle capacity to work. The use of top-performing content delivery networks like Akamai, Level 3, and Limelight enables the best service for clients.
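A back-of-the-envelope calculation shows why low utilization makes virtualization economical. The figures below are purely illustrative assumptions, not measurements:

```python
# Back-of-the-envelope consolidation estimate (illustrative numbers only):
# if each physical server averages 15% utilization, virtualization lets
# several such workloads share one machine.
avg_utilization = 0.15        # assumed average per-server load
target_utilization = 0.80     # leave headroom on the host
vms_per_host = int(target_utilization / avg_utilization)
print(vms_per_host)           # 5 workloads fit on a single host

hosts_saved = 100 - (100 // vms_per_host)   # consolidating 100 servers
print(hosts_saved)            # 80 machines freed up
```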

Windows Server 2008 integrates new technologies such as virtualization tools and security enhancements, saving time and costs. It is equipped with Hyper-V technology, which enables web hosts to create virtual machines out of their servers. The latest web server, IIS 7.0, is also included and is equipped to handle web-based services with ease; IIS provides facilities such as a web server, an SMTP server, and an NNTP server.

Hosting companies may use Linux or Unix operating systems on their web servers. Among the Linux distributions, Red Hat and Mandrake are proven systems that can manage hundreds of websites and millions of hits per day. Among the Unix variants, FreeBSD and OpenBSD are commonly used. Linux, Unix, and Windows can all accommodate add-ons such as chat, email, and web statistics. Windows is far more expensive because of the initial purchase cost and the additional charges for license renewal.

Linux and Unix are also easier to use when it comes to cloud hosting or virtualization, since these operating systems allow close coordination between all the components of the network. The choice must be based on the webmaster's need to run other features: Unix is best for supporting PHP and MySQL, while Windows is the best platform for applications based on .NET and VB and is also needed for Active Server Pages, FrontPage Extensions, and Visual Basic.


Control Panels

While shopping for web hosting services, webmasters must have a clear understanding of the nature of their proposed website and the degree of control they would like over future changes. Websites differ in the kind of updating they require over time: some need constant textual changes, while others incorporate photographs, graphics, and so on. The control panels offered vary according to the package subscribed to, but all packages offer the basic components a client needs. Options to add new domains, change passwords, and add new accounts are some of the main features.

cPanel

cPanel (a "control panel") is a graphical web-based control panel that simplifies administration through its interface. The software is not free; it is paid for through a monthly subscription. cPanel is versatile enough to run on FreeBSD, Red Hat Enterprise Linux, and CentOS. Since cPanel has become a commonly accepted tool for web hosting, a webmaster is at an advantage having it installed and linked to the website. cPanel provides a statistics tool covering details such as disk space used, domain names, monthly bandwidth usage, e-mail accounts, secondary domains, and SQL databases. Managing FTP data and editing CGI scripts are easier with cPanel. Fantastico Deluxe is supported by cPanel and allows webmasters to install 50 or more free scripts without any specific programming. Switching the interface to suit specific needs also becomes easier. A cPanel-enabled account allows the webmaster to access the website and upload, edit, or delete files, and the tool offers provision for backing up files in case of a system crash or accidental deletion of important data; backups of mailing lists and e-mail information are also available.

Plesk 8.6

The Plesk control panel allows setting up new sites, mail accounts, and DNS entries. Plesk is available for POSIX platforms such as SUSE, FreeBSD, and Red Hat Linux, as well as for Windows Server 2003 and 2008. The tool helps create templates for clients and sites. Plesk is usually used along with Windows Server, since Windows Server plans do not come with a control panel and the server must otherwise be administered through SSH/shell commands. Plesk allows that work to be done via shell access, but in an automated way at the click of a mouse. It is one of the most secure and stable tools for website maintenance. It works well with both Windows and Linux, avoiding the need for changes to the operating system, and it is highly compatible with the Windows Server 2008 software.

While it is not mandatory to have a control panel from a host, it is better to have one, since managing the website externally becomes unwieldy. cPanel and Plesk are both reliable, quick, and secure; the major differences arise in the user interface. Changing one's host may make it difficult to continue with the same control panel. The long-term usage charge is less for cPanel than for Plesk, and cPanel is compatible with almost all web hosting tools and is very user-friendly. If an unlimited domain license is bought, however, Plesk is much cheaper than cPanel. Plesk is compatible with SUSE Linux, while cPanel is not. XML interfaces are not provided with cPanel, while Plesk supports XML and allows PHP or Perl scripts. A good control panel is useful for optimizing the website for SEO ranking and incorporating key changes for interaction with the user.


Web Databases

A website has a lot of information that has to be accumulated from different sources to present a whole picture. The importance of databases to the working of a website cannot be overstated: a website needs to access a database for recording and generating orders, customer information, and product specifications.
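The pattern is the same in any SQL database. The sketch below uses Python's built-in SQLite purely as a stand-in for Access or MySQL (the table and customer names are invented) to show orders being recorded against customer information:

```python
import sqlite3

# SQLite stands in here for MS Access / MySQL: the pattern of recording
# orders against customer information is the same in any SQL database.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (
        id INTEGER PRIMARY KEY,
        customer_id INTEGER REFERENCES customers(id),
        product TEXT,
        quantity INTEGER
    );
""")
conn.execute("INSERT INTO customers (name) VALUES ('Acme Pty Ltd')")
conn.execute(
    "INSERT INTO orders (customer_id, product, quantity) VALUES (1, 'widget', 3)"
)

# Generate an order report by joining the two tables.
row = conn.execute("""
    SELECT c.name, o.product, o.quantity
    FROM orders o JOIN customers c ON c.id = o.customer_id
""").fetchone()
print(row)  # ('Acme Pty Ltd', 'widget', 3)
```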

A number of databases are available for the purpose, depending on the type and volume of data to be presented. Microsoft Access is one of the simplest. The software is relatively inexpensive and may be offered free by web hosts, and it is easy for webmasters of average skill to operate and master. MS Access provides highly flexible data management, and its user-friendly tools help smaller websites manage their data simply in the beginning. A further advantage is that the database can be converted into MS SQL format whenever the need arises.

An MS Access database is best suited to desktop applications serving a restricted number of users at a time. Access is compatible with the systems used by home users, and sending copies of the database to individual clients is easy. Because Access is so commonly available, it is not necessary to install SQL Server on every user machine. MS Access is, however, a local database that cannot be accessed from remote locations: the intuitive GUI enables quick application design, but any modification to the database is done locally and then uploaded to the web server. This becomes cumbersome once the volume of information crosses certain limits, leading to longer processing times. MS Access also offers more limited security for database content than other systems.

MS SQL is a more robust, web-server-based database designed to manage access by multiple users, which is not possible with MS Access. Even in the case of a server crash, data recovery is greatly enhanced, so database-driven websites prefer it over Access. It requires development tools such as VB, .NET Visual Studio, and other front-end tools. MS SQL may prove costly for database hosting: being a proprietary product, the license fees and the cost of maintenance by certified technicians can be large. In terms of performance, though, it is very reliable. The data manager is an excellent tool for managing information in the database, and data exchange is quick and efficient, including the filtering of unnecessary traffic. MS SQL allows the webmaster to manipulate the database from any location, and the software allows scheduling tasks, setting alerts, incorporating security accounts, and managing data transfer between different sources. SQL Server offers multiple authentication levels for different users: login to the server, to the database, and to each module within the database.

MySQL, by contrast, encompasses multiple storage engines and allows the webmaster to select the most appropriate one for each table; this range of storage engines is why MySQL is preferred by many professionals. The engines include Falcon, Memory (heap), Federated, Archive, Merge, CSV, Blackhole, Cluster, MyISAM, BDB, EXAMPLE, and Maria, and developer communities are customizing MySQL to build their own unique storage engines. MySQL also allows commit grouping, which increases the number of commits per second by batching multiple transactions.

MS SQL Server and MySQL Server are the preferred databases because of their features for manipulating, securing, and managing data. However, the lack of some key database features in MySQL requires an interface to be developed around it.


Web Hosting Tools And Services

Earlier, web hosting simply meant hiring computer storage for one's website. With advances in technology and falling hardware prices, however, hosts have been providing ever more ready-made features to attract potential clients. Microsoft Windows is one of the operating systems commonly used on servers by web hosts; it supports technologies like ColdFusion that are required to create and maintain dynamic websites. Managed host servers may also provide services such as ASP, ASP.NET, Access, and MS SQL support to maintain websites.

Linux and Unix servers are meant for clients who require websites based on open source technology, such as PHP and MySQL. They ensure better data security compared with other technologies, and the ready availability of, and open knowledge about, the code helps with upgrades and customization. Unlike Windows, they do not require frequent license renewals.

A number of tools have become an accepted standard for web hosts. These include CGI, Ruby (RoR), Perl, PHP, MySQL, and so on. They enable webmasters to maintain and upgrade their websites without having to worry about unsupportive platforms. Often, hosts also provide easy-to-set-up functions common to all websites; these include incorporating PayPal services, mail servers, virus prevention, and so on.
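What a CGI-style tool does can be sketched in a few lines: the script emits an HTTP header block, a blank line, and then the generated HTML. This Python sketch is illustrative only; the visitor name stands in for a real query parameter.

```python
# Minimal sketch of what a CGI-style script produces: an HTTP header
# block, a blank line, then the dynamically generated HTML body.
def render_page(visitor: str) -> str:
    body = f"<html><body><h1>Hello, {visitor}!</h1></body></html>"
    return "Content-Type: text/html\r\n\r\n" + body

response = render_page("world")   # 'world' stands in for a query parameter
print(response)
```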

The advent of virtualization has rapidly expanded the number of options before the client. These take the form of packages, each combining features and varying from the others in cost, speed, disk space, data transfer rate, and other extras. Although finding the ideal one may not be easy for many webmasters, some simple points can help identify the best. The webmaster has to estimate the amount of traffic likely to be attracted, and the amount of information to be stored and the frequency and nature of upgrades have to be decided beforehand. A website with few but popular pages requires little disk space but high bandwidth and fast data transfer capacity. Sites that require user registration usually cater to a specific, targeted audience and need secure information transfer at a consistent speed; such sites may have a fixed number of email accounts and a fixed amount of data at any given time, and work well with more disk space and comparatively average data transfer speeds.
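Estimating the required data transfer is simple arithmetic. The sketch below uses assumed figures for page weight and traffic:

```python
# Rough monthly transfer estimate (all figures are assumptions):
page_size_kb = 500          # average page weight, including images
views_per_day = 2_000       # expected daily page views

# KB/day * 30 days, converted to GB (1 GB = 1,048,576 KB).
monthly_transfer_gb = page_size_kb * views_per_day * 30 / 1_048_576
print(round(monthly_transfer_gb, 1))   # ~28.6 GB/month
```

A plan's transfer allowance should sit comfortably above this estimate, with headroom for traffic spikes.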

The webmaster must decide the extent of content that will appear on the site and the amount of interactivity expected. The packages on offer are designed to suit the majority of websites. Sites with a huge number of pages and great complexity that cater to a fixed number of clients can take packages providing large disk space with limited email accounts and limited data transfer rates. Sites used as online stores may see huge traffic and are better served by relatively small disk space, fast data transfer, unlimited email accounts, and so on. While virtualization has improved customer service, risks such as theft of data and security breaches of private and personal information databases have to be monitored.

The success of websites depends on how easily and quickly users reach them in their quest for information. Many web hosts therefore also offer services that improve a website's standing in the evaluation criteria of search engines. Spreading marketing across numerous domain names on many IPs and Class C ranges increases online exposure, so these hosts provide multiple Class C IP addresses with each web account, increasing the chances of better rankings for the website. Such hosts are especially helpful for webmasters who lack the resources to work on SEO optimization after the initial design.