Thursday, October 22, 2009

Shared web hosting service

A shared web hosting service (also known as a virtual hosting service) is a web hosting service where many websites reside on one web server connected to the Internet. Each site "sits" on its own partition, or section/place on the server, to keep it separate from other sites. This is generally the most economical option for hosting, as many people share the overall cost of server maintenance.

The hosting service must include system administration since it is shared by many users; this is a benefit for users who do not want to deal with it, but a hindrance to power users who want more control. In general, shared hosting is inappropriate for users who require extensive software development outside what the hosting provider supports. Almost all applications intended to run on a standard web server work fine with a shared web hosting service. On the other hand, shared hosting is cheaper than other types of hosting such as dedicated server hosting. Shared hosting usually has usage limits, and most hosting providers have extensive reliability features in place. [1]


Shared hosting typically uses a web-based control panel system, such as cPanel, Ensim, DirectAdmin, Plesk, InterWorx, H-Sphere or one of many other control panel products. Most of the large hosting companies use their own custom-developed control panel. Control panels and web interfaces can cause controversy, however, since web hosting companies sometimes sell the right to use their control panel system to others. Attempting to recreate the functionality of a specific control panel is common, which leads to many lawsuits over patent infringement.[2]

In shared hosting, the provider is generally responsible for managing servers, installing server software, security updates, technical support, and other aspects of the service. Most servers are based on the Linux operating system and the LAMP software bundle, driven by the low cost of open source software, but some providers offer Microsoft Windows-based or FreeBSD-based solutions. For example, the Plesk and Ensim control panels are both available for Linux and Windows. Versions for either OS have very similar interfaces and functionality, with the exception of OS-specific differences (for example: ASP.NET, SQL Server and Access support under Windows; MySQL under Linux).

There are thousands of shared hosting providers in the United States alone. They range from mom-and-pop shops and small design firms to multi-million-dollar providers with hundreds of thousands of customers. A large portion of the shared web hosting market is driven through pay per click (PPC) advertising or Affiliate programs.

Shared web hosting can also be done privately by sharing the cost of running a server in a colocation centre; this is called cooperative hosting.

Obtaining Hosting

Web hosting is often provided as part of a general Internet access plan; there are many free and paid providers offering these services.

A customer needs to evaluate the requirements of the application to choose what kind of hosting to use. Such considerations include database server software, scripting software, and operating system. Most hosting providers offer Linux-based web hosting, which supports a wide range of software. A typical configuration for a Linux server is the LAMP platform: Linux, Apache, MySQL, and PHP/Perl/Python. The web hosting client may want other services as well, such as email for their business domain, databases, or multimedia services for streaming media. A customer may also choose Windows as the hosting platform; the customer can still choose from PHP, Perl, and Python but may also use ASP.NET or Classic ASP.

Web hosting packages often include a Web Content Management System, so the end-user doesn't have to worry about the more technical aspects. These Web Content Management systems are great for the average user, but for those who want more control over their website design, this feature may not be adequate.

Most modern desktop operating systems (Windows, Linux, Mac OS X) are also capable of running web server software, and thus can be used to host basic websites.

One may also search the Internet for active web hosting message boards and forums that may provide feedback on what type of web hosting company may suit one's needs.

Types of hosting

Internet hosting services can run Web servers.

Hosting services limited to the Web:

Many large companies that are not Internet service providers also need a computer permanently connected to the web so they can send email, files, etc. to other sites. They may also use the computer as a website host to provide details of their goods and services to anyone interested, and to let visitors place online orders.

  • Free web hosting service: Free web hosting is offered by different companies with limited services, sometimes advertisement-supported web hosting, and is often limited when compared to paid hosting.
  • Shared web hosting service: one's website is placed on the same server as many other sites, ranging from a few to hundreds or thousands. Typically, all domains may share a common pool of server resources, such as RAM and the CPU. The features available with this type of service can be quite extensive. A shared website may be hosted with a reseller.
  • Reseller web hosting: allows clients to become web hosts themselves. Resellers could function, for individual domains, under any combination of these listed types of hosting, depending on who they are affiliated with as a provider. Resellers' accounts may vary tremendously in size: they may have anything from their own virtual dedicated server to a colocated server. Many resellers provide a nearly identical service to their provider's shared hosting plan and provide the technical support themselves.
  • Virtual Dedicated Server: also known as a Virtual Private Server (VPS), divides server resources into virtual servers, where resources can be allocated in a way that does not directly reflect the underlying hardware. A VPS will often be allocated resources based on a one-server-to-many-VPSs relationship, but virtualisation may be done for a number of reasons, including the ability to move a VPS container between servers. Users may have root access to their own virtual space. Customers are sometimes responsible for patching and maintaining the server.
  • Dedicated hosting service: the user gets his or her own Web server and gains full control over it (root access for Linux/administrator access for Windows); however, the user typically does not own the server. Another type of dedicated hosting is self-managed or unmanaged hosting. This is usually the least expensive of the dedicated plans. The user has full administrative access to the box, which means the client is responsible for the security and maintenance of his own dedicated server.
  • Managed hosting service: the user gets his or her own Web server but is not allowed full control over it (root access for Linux/administrator access for Windows); however, they are allowed to manage their data via FTP or other remote management tools. The user is disallowed full control so that the provider can guarantee quality of service by not allowing the user to modify the server or potentially create configuration problems. The user typically does not own the server. The server is leased to the client.
  • Colocation web hosting service: similar to the dedicated web hosting service, but the user owns the colo server; the hosting company provides physical space that the server takes up and takes care of the server. This is the most powerful and expensive type of the web hosting service. In most cases, the colocation provider may provide little to no support directly for their client's machine, providing only the electrical, Internet access, and storage facilities for the server. In most cases for colo, the client would have his own administrator visit the data center on site to do any hardware upgrades or changes.
  • Cloud hosting: a newer type of hosting platform that offers customers powerful, scalable and reliable hosting based on clustered, load-balanced servers and utility billing. It removes single points of failure and lets customers pay only for what they use rather than what they might use.
  • Clustered hosting: having multiple servers hosting the same content for better resource utilization. Clustered Servers are a perfect solution for high-availability dedicated hosting, or creating a scalable web hosting solution. A cluster may separate web serving from database hosting capability.
  • Grid hosting: this form of distributed hosting is when a server cluster acts like a grid and is composed of multiple nodes.
  • Home server: usually a single machine placed in a private residence that can be used to host one or more websites, typically over a consumer-grade broadband connection. These can be purpose-built machines or, more commonly, old PCs. Some ISPs actively attempt to block home servers by disallowing incoming requests to TCP port 80 of the user's connection and by refusing to provide static IP addresses. A common way to attain a reliable DNS hostname is by creating an account with a dynamic DNS service, which automatically changes the IP address that a URL points to when that address changes, as sketched below.
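As a rough illustration of how such a dynamic DNS update might be automated, here is a minimal Python sketch that polls a public "what is my IP" service and calls a provider's update URL whenever the address changes. The update URL and hostname are hypothetical placeholders; every dynamic DNS provider has its own API and authentication scheme.

    import time
    import urllib.request

    # Hypothetical update endpoint and hostname; substitute your provider's real API.
    UPDATE_URL = "https://dyndns.example.com/update?hostname=myhome.example.net&ip={ip}"

    def public_ip():
        # api.ipify.org simply echoes back the caller's public IP address.
        with urllib.request.urlopen("https://api.ipify.org") as resp:
            return resp.read().decode().strip()

    last_ip = None
    while True:
        current = public_ip()
        if current != last_ip:
            urllib.request.urlopen(UPDATE_URL.format(ip=current))
            last_ip = current
        time.sleep(300)  # check again in five minutes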

Web hosting service

A web hosting service is a type of Internet hosting service that allows individuals and organizations to make their own website accessible via the World Wide Web. Web hosts are companies that provide space on a server they own or lease for use by their clients as well as providing Internet connectivity, typically in a data center. Web hosts can also provide data center space and connectivity to the Internet for servers they do not own to be located in their data center, called colocation.

Service scope

The scope of hosting services varies widely. The most basic is web page and small-scale file hosting, where files can be uploaded via File Transfer Protocol (FTP) or a Web interface. The files are usually delivered to the Web "as is" or with little processing. Many Internet service providers (ISPs) offer this service free to their subscribers. People can also obtain Web page hosting from other, alternative service providers. Personal web site hosting is typically free, advertisement-sponsored, or cheap. Business web site hosting is often more expensive.
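Uploading a page to such an FTP-based host can be scripted in a few lines of Python using the standard ftplib module. This is only a sketch: the host name, credentials and directory below are placeholders for whatever details your provider supplies.

    from ftplib import FTP

    # Placeholder host and credentials; use the values from your hosting provider.
    with FTP("ftp.example.com") as ftp:
        ftp.login(user="username", passwd="password")
        ftp.cwd("public_html")                  # the web document root on many shared hosts
        with open("index.html", "rb") as fh:
            ftp.storbinary("STOR index.html", fh)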

Single page hosting is generally sufficient only for personal web pages. A complex site calls for a more comprehensive package that provides database support and application development platforms (e.g. PHP, Java, Ruby on Rails, ColdFusion, and ASP.NET). These facilities allow the customers to write or install scripts for applications like forums and content management. For e-commerce, SSL is also highly recommended.

The host may also provide an interface or control panel for managing the Web server and installing scripts as well as other services like e-mail. Some hosts specialize in certain software or services (e.g. e-commerce). They are commonly used by larger companies to outsource network infrastructure to a hosting company.

Hosting reliability and uptime

Hosting uptime refers to the percentage of time the host is accessible via the internet. Many providers state that they aim for at least 99.9% uptime (roughly 43 to 45 minutes of downtime a month, or less), but there may be server restarts and planned (or unplanned) maintenance in any hosting environment, which may or may not be considered part of the official uptime promise.
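The arithmetic behind such figures is straightforward; this small Python sketch converts an uptime percentage into the downtime it permits over a 30-day month.

    def downtime_minutes_per_month(uptime_percent, days=30):
        """Minutes of downtime permitted per month by a given uptime percentage."""
        return (1 - uptime_percent / 100) * days * 24 * 60

    print(downtime_minutes_per_month(99.9))    # about 43 minutes in a 30-day month
    print(downtime_minutes_per_month(99.99))   # about 4.3 minutes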

Many providers tie uptime and accessibility into their own service level agreement (SLA). SLAs sometimes include refunds or reduced costs if performance goals are not met.


Saturday, September 26, 2009

Business Telephone Systems

Phone systems can make a significant difference in the growth and development of a business, so it is important to choose an appropriate phone system.

Phone systems are designed according to the requirements of the office. Nowadays, systems range from six-line phone systems for small businesses to larger systems capable of handling hundreds of phone lines at one time. Before purchasing a phone system you should know the particular needs of the company. If you have some knowledge of phone installation and programming, it can be more cost-effective to buy an unassembled phone system. Fully scalable packaged phone systems such as the SAMSUNG iDCS 100 are also available with all parts fully assembled, and can be configured to your exact specifications. In telecommunications, VoIP is the most advanced technology; it uses Internet telephony for communications and is suitable for all types of businesses because it is inexpensive and scalable.

Small plug-and-play telephone systems are suitable for businesses with 3 to 16 employees. Each plug-and-play telephone system is assembled and programmed in our office, and the systems are backed by our in-house technical support and help desk. Remote programming is also done by our experienced staff via GoToMeeting, so no security risks are involved in connecting your phone system to your network. Excellent maintenance plans and technical support are provided by us.

Organisations with 16 to 48 employees usually choose medium plug-and-play telephone systems. The medium-sized telephone systems start with a capacity of 8 outside telephone lines, up to 16 digital extensions, voice mail and 6 telephone terminals. Any small business owner can implement and maintain their own telecommunications solution with our help, as we have selected medium phone systems that are easy to set up, install and use.

Large telephone systems such as the NEC DSX-160 telephone system package are ideal for large businesses. You can also ensure that you have the proper configuration and equipment by calling our technical support staff. If you want to save money, bulk purchasing can be a great idea.

iDCS telephones are called 'keyphones' or 'keysets'. They offer many automatic and programmable features. The iDCS 28D keyset has 28 programmable keys, the 18D keyset has 18, and the 8D and 8S keysets have 8. All but the 8S keysets have a liquid crystal display for showing call information, feature menus and so on.

To become familiar with the operation of your keyset, you should study the iDCS keyset user guide. Everyday telephone communication becomes easy once you learn to use your keyset correctly. Your iDCS keyset is the most visible part of your telephone system, and telephone calls are handled the same way regardless of which model keyset you are using. The 28D and 18D model keysets have additional conveniences that are not available to 8D keyset users.

What is Linux

Linux is an operating system that was initially created as a hobby by a young student, Linus Torvalds, at the University of Helsinki in Finland. Linus had an interest in Minix, a small UNIX system, and decided to develop a system that exceeded the Minix standards. He began his work in 1991 when he released version 0.02 and worked steadily until 1994 when version 1.0 of the Linux Kernel was released. The kernel, at the heart of all Linux systems, is developed and released under the GNU General Public License and its source code is freely available to everyone. It is this kernel that forms the base around which a Linux operating system is developed. There are now literally hundreds of companies and organizations and an equal number of individuals that have released their own versions of operating systems based on the Linux kernel. More information on the kernel can be found at our sister site, LinuxHQ and at the official Linux Kernel Archives. The current full-featured version is 2.6 (released December 2003) and development continues.

Apart from the fact that it's freely distributed, Linux's functionality, adaptability and robustness have made it the main alternative to proprietary Unix and Microsoft operating systems. IBM, Hewlett-Packard and other giants of the computing world have embraced Linux and support its ongoing development. Well into its second decade of existence, Linux has been adopted worldwide primarily as a server platform. Its use as a home and office desktop operating system is also on the rise. The operating system can also be incorporated directly into microchips in a process called "embedding" and is increasingly being used this way in appliances and devices.

Throughout most of the 1990s, tech pundits, largely unaware of Linux's potential, dismissed it as a computer hobbyist project, unsuitable for the general public's computing needs. Through the efforts of developers of desktop management systems such as KDE and GNOME, the office suite project OpenOffice.org and the Mozilla web browser project, to name only a few, there is now a wide range of applications that run on Linux, and it can be used by anyone regardless of his or her knowledge of computers. Those curious to see the capabilities of Linux can download a live CD version called Knoppix. It comes with everything you might need to carry out day-to-day tasks on the computer and it needs no installation. It will run from a CD in a computer capable of booting from the CD drive. Those choosing to continue using Linux can find a variety of versions or "distributions" of Linux that are easy to install, configure and use. Information on these products is available in our distribution section and can be found by selecting the mainstream/general public category.

SQL Server Replication

Replication is a set of technologies for copying and distributing data and database objects from one database to another and then synchronizing between databases to maintain consistency. Using replication, you can distribute data to different locations and to remote or mobile users over local and wide area networks, dial-up connections, wireless connections, and the Internet.

Transactional replication is typically used in server-to-server scenarios that require high throughput, including: improving scalability and availability; data warehousing and reporting; integrating data from multiple sites; integrating heterogeneous data; and offloading batch processing. Merge replication is primarily designed for mobile applications or distributed server applications that have possible data conflicts. Common scenarios include: exchanging data with mobile users; consumer point of sale (POS) applications; and integration of data from multiple sites. Snapshot replication is used to provide the initial data set for transactional and merge replication; it can also be used when complete refreshes of data are appropriate. With these three types of replication, SQL Server provides a powerful and flexible system for synchronizing data across your enterprise.

In addition to replication, in SQL Server 2008 you can synchronize databases by using Microsoft Sync Framework and Sync Services for ADO.NET. Sync Services for ADO.NET provides an intuitive and flexible API that you can use to build applications that target offline and collaboration scenarios. For an overview of Sync Services for ADO.NET, see Microsoft Sync Framework. For complete documentation, see the MSDN Web site.

Thursday, August 27, 2009

Speaker Series on Visual Networking Index

The Cisco Visual Networking Index (VNI) is an ongoing Cisco initiative designed to forecast, track, and analyze IP networking growth and trends worldwide. Cisco has developed a forecast methodology based on custom modeling tools and analysis using inputs from a variety of independent analyst data.

Cisco recently released a significant VNI Forecast and Methodology update covering 2008-2013. This includes new findings and usage trends for consumer, business, and mobile segments.
Here are just a few of the top-level findings from the updated research:
  • By 2013, annual global IP traffic will reach two-thirds of a zettabyte (673 exabytes). (Last year’s forecast anticipated a run rate of 522 exabytes per year in 2012.)
  • Global IP traffic will quintuple from 2008 to 2013.
  • By 2013, the sum of all forms of video will exceed 90% of global consumer traffic.
  • Global consumer Internet video traffic will increase at a 39% CAGR from 2008 to 2013.
  • By 2013, the Internet will be four times larger than it was in 2009.

You can replay this 60-minute webcast to learn more and have an interactive dialogue with Cisco representatives on VNI.

Speaker Series on Smart Connected Communities, A Cisco Globalisation Initiative

This Speaker Series webinar introduced and discussed Cisco's latest initiative driven out of our Globalisation Centre East headquarters in Bangalore called Smart Connected Communities. 500 million people will be urbanised over the next five years. 100 new one million-plus cities will be built by 2025. Trillions of dollars in stimulus packages have been announced, much of which will go into infrastructure. And the resulting environmental impact of this massive urbanisation will be significant; already the top 20 megacities use 75 percent of the world's energy. The ability to sustainably balance social, economic and environmental resources is more urgent than ever before.

The Smart Connected Communities initiative provides a blueprint for how Cisco aims to capture this market transition through the development of visionary vertical solutions and new global ecosystems.

A Closer Look At the Cisco NAC Profiler

This is another one of my online experiments. As long as it appears useful I am going to track the “significant issues” about the Cisco NAC Profiler raised by other analysts, journalists and vendors; continue to collect data and arguments; and strive to clearly separate fact, opinion and bias. The goal is to help readers better understand what the NAC Profiler CAN and Can NOT do for them in their own particular networks and organizations. If this “goes well” I will extend this idea to other products. You and the referenced authors are encouraged to comment and raise new issues and perspectives.

Issue 1: Standalone Profiler?

“The products actually do good things in a Cisco context, except that NAC Profiler requires the NAC Appliance. The discovery and reporting concept is important enough to stand by itself, and what good is NAC Appliance going to do for a printer or phone or physical security system anyway? Cisco screwed up the initial NAC release by requiring a complete network refresh; now Cisco introduces the NAC Profiler that requires the additional expense of a NAC Appliance infrastructure. They should unbundle the network profiler, and expose its ability to move up the stack to detect servers.” (Source: Eric Ogden, Security Analyst, Ogren Group - original post).
CORRECTED FINDINGS: According to Cisco, customers CAN purchase a standalone “profiler” from Great Bay Software and operate it without the Cisco NAC Appliance. What does this mean? (1) Cisco will NOT sell and support this system. (2) It will passively collect data from endpoints (i.e., type, location, and behavioral attributes) and data about endpoints from NetFlow-enabled network devices (via network mapping, an SNMP trap receiver/analyzer, passive network analysis, and active inquiry) and store all of it in a device inventory database. (3) It will NOT automatically block either unauthorized or misbehaving devices, as these functions require integration with the NAC Appliance. (4) You will need hardware to run the Profiler Collectors, which otherwise would be installed on the Cisco NAC Appliance.

Issue 2: It’s simply an OEM Product

“Since the NAC Profiler is just an OEM of the Great Bay software, users could choose to deploy it in isolation. I see this more as Cisco trying to make the NAC Appliance more functional, and struggling at it.” (Source: Michelle Mclean, Product Marketing Manager, Consentry - original comment)

FINDINGS: Integration of the NAC Profiler with the NAC Appliance automates the detection and blocking of unauthorized and misbehaving non-authenticating devices. The two management interfaces are also integrated, so both data sets are presented in a single interface on the NAC Manager. In the Cisco edition of the GBS software the Profiler collection engine is co-resident on the NAC Application Server, eliminating separate collector servers. And finally, the customer enjoys worldwide Cisco support.

Issue 3. More Value Than NESSUS?

“Would having the NAC Profiler by itself be interesting, meaningful, or valuable? Not if all it does is repeat what Nessus or other tools already do. But if it does a lot more (besides telling you a printer can’t do 802.1X authentication) then that might be interesting.” (Source: Mitchell Ashley, CTO & GM, StillSecure - original post).

FINDINGS: NESSUS simply scans the endpoint, as it is a vulnerability detection tool. In contrast, NAC Profiler scans the endpoint AND collects a lot of data about endpoint behavior through a combination of DHCP snooping, SNMP traps, NetFlow data, and SPAN port monitoring. Its mission is to detect aberrant behavior, which can mean an attack is already underway. Read the Cisco NAC Profiler Installation and Configuration Guide for details.

Issue #4: Is NAC Profiler Effective?

“It’s an interesting feature, but the big unknown is how accurate the Profiler’s discovery and classification is. We have never tested Great Bay’s software so we can’t speak to its accuracy. Having tested all manner of passive discovery devices over the years, we have found that the classifications were usually accurate but not 100%. Often not even 75%, and sometimes less.” (Source: Mike Fratto, Network Computing - complete analysis)

FINDINGS: No behavioral-based security system is “100% accurate”, as it looks at multiple data points and then estimates the likelihood that a specific event is occurring, with an assigned level of confidence (not certainty) - think about all the network and host intrusion prevention software deployed around the world. So Mike is simply raising an unresearched potential issue. Since Great Bay’s customers are enthusiastic about this product, GBS must be doing something right!
Mike, I recommend you discuss your concern with GBS and report what you learn. You owe them that courtesy after “casting a shadow” on their product.

Upcoming Network Computing NAC Product Reviews

Now some good news (possibly). Network Computing (NC) has announced plans to publish “rolling NAC product reviews” based on their comprehensive testing of NAC products. So why is this important (maybe)? Because NC has a relatively good reputation for evaluating technical network requirements and products. Does this naturally extend to NAC? Maybe. Maybe not. The jury (us) is still awaiting evidence and expert testimony.

First, some background. Every online “magazine” is currently trying - often desperately - to carve out a market position as a “major source” of information on NAC (network admission control and network access control). Network World (NW), for example, offers “NAC Cram Session”, a currently weak collection of content of uneven quality and timeliness stitched together under an awful name. And recently NC announced its new “NAC Immersion Center”. So far, like the other publications, this is largely a repackaging and re-branding exercise with a promise of better things to come.

But with NC we have some basis for expecting more. Mike Fratto is knowledgeable, well-intended and humble (I admit, this is secondhand knowledge), and NC does historically act according to a seemingly higher journalistic standard than many other network publications. So there is a solid basis for hope. But optimism?

So what will it take for NC to earn its stripes in NAC coverage? With its upcoming NAC reviews it has created an opportunity to succeed and to fail, and its readers should set a high standard for quality to judge how well NC performs. Here are my thoughts on what I would say to Mike if he cared to listen to my lone voice. I welcome yours.

NC needs to publish a detailed test plan so everyone understands what they are evaluating, why, what would satisfy/please them, and some idea of how important NC views each capability. The absence of this information severely weakened the recent Network World NAC product reviews. NC should avoid this amateurish mistake.

NC should review its test plans with its readers BEFORE publishing its test results and analysis so there is a better chance readers will appreciate and consider NC’s frame-of-reference BEFORE being distracted by NC’s judgments of specific NAC products.
I encourage NC to resist the “irresistible urge” to publish numerical scores as these are most often a disservice to vendors and potential buyers. Instead, please focus on spreading actual knowledge rather than scores.

Please provide readers with an in-depth view of your evaluation model so they can understand the variables and your weighting. Readers will then have the important opportunity to tailor your model to meet their own needs and preferences. That would be a great service. In contrast, NW did this only at a macro level - which is meaningless.

I hope you have already sought beneficial input from vendors and respected security professionals BEFORE you defined your test plan. Knowing this and who they are can only increase the value of what you are doing and the credibility of the NC results.

(Added after writing the post NAC Product Testing. Is there a better Way?) Your readers could learn a great deal more about individual products AND product categories AND their suitability for various situations if they could observe and participate in constructive discussions and debates about your tests and findings AFTER you publish them. In this revised model for product evaluations, a forum where reviewers, vendors and your readers contribute their ideas becomes a major part of your product evaluation “service”. In one sense, NC becomes the instructor who successfully unleashes the incredible power of student knowledge. Yes, this would mean NC would need to rethink its product review model and create an effective new forum. But you can tap into key existing components: the latest web technologies, a huge pool of knowledgeable readers, and their desire to be heard (questions and answers).

McAfee VirusScan Plus

McAfee VirusScan Plus is an ideal candidate for those seeking an antivirus/firewall combination without all the bloat of traditional Internet security suites. McAfee VirusScan Plus makes easy work of removing adware and spyware, something not all antivirus products deliver. The SiteAdvisor service is included, helping to guard against malicious websites.

Norton Internet Security 2009

Symantec's antivirus products have historically always provided excellent detection and removal of malware. On the downside, that protection came at the price of often crippling system performance. That's now a thing of the past. Performance overhauls are the hallmark of Norton Internet Security 2009, which features 'pulse updates' to deliver more frequent and thus smaller signature updates, whitelisting to streamline scan times, and a lighter, sleeker footprint that installs in mere minutes.

Unspam files lawsuit against unnamed cybercrooks

Anti-spam firm hopes to force banks to share more information on attacks.

Unspam Technologies, the company behind Project Honey Pot, has filed a lawsuit against unnamed 'John Does', who are thought to be responsible for stealing millions of dollars every month from US bank accounts through the use of malware. In 2007, Unspam filed a similar lawsuit against as-yet-unidentified spammers.
The purpose of this lawsuit, filed in a Federal District Court in Virginia, is to convince, or if necessary force, banks to disclose information that will help unmask the identities of the crooks involved in cybercrime. Banks are known to be reluctant to share information about theft through phishing and malware, or even to admit that they have suffered from such theft, and it is believed that this works to the advantage of the criminals behind such attacks.

A second goal of the lawsuit is to find a 'chokepoint' in the systems used by banks that makes for easier abuse. Jon Praed, Unspam's attorney and an experienced anti-spam lawyer, said that one possibility would be that the information made available through the lawsuit will show that banks are only using single-factor authentication; in that case, it is hoped that the case would help strengthen the authentication processes used by the banking sector.

Thursday, May 28, 2009

Flexible and scalable solutions for professional video surveillance and remote monitoring

Axis provides a full range of network video solutions for a broad spectrum of industry segments and applications, as well as specific solutions for specific situations.
For example, for CCTV users faced with the technology shift from their existing analog systems to network video, Axis offers solutions for migrating and for expanding seamlessly. We make deployment easy in all types of environments: indoor, outdoor, wired, wireless, and in rough, tough conditions.

Open standards only - for full integration and video management capabilities
Axis network video solutions are based on Axis' VAPIX®, our own open, standard-setting API (application programming interface). This makes Axis network video solutions cost-efficient, flexible, scalable, future-proof and easy to integrate with other systems, such as access control and building management systems.

Fully qualified domain name

A fully qualified domain name (FQDN), sometimes referred to as an absolute domain name, is a domain name that specifies its exact location in the tree hierarchy of the Domain Name System (DNS). It specifies all domain levels, including the top-level domain, relative to the root domain. A fully qualified domain name is distinguished by its unambiguity; it can only be interpreted one way.


For example, given a device with a local hostname myhost and a parent domain name example.com, the fully qualified domain name is myhost.example.com. The FQDN therefore uniquely identifies the device — while there may be many hosts in the world called myhost, there can only be one myhost.example.com.


In the DNS, and most notably, in DNS zone files, a FQDN is specified with a trailing dot, for example, "somehost.example.com.". The trailing dot denotes the root domain. Most DNS resolvers will process a domain name that contains a dot as being an FQDN[1] or add the final dot needed for the root of the DNS tree. Resolvers will process a domain name without a dot as unqualified and automatically append the system's default domain name and the final dot.
Some applications, such as web browsers, will try to qualify the domain name part of a Uniform Resource Locator (URL) if the resolver cannot find the specified domain. Some applications, however, never use trailing dots to indicate absoluteness, because the underlying protocols, such as e-mail (SMTP), require the use of FQDNs.
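Python's standard socket module can be used to see both behaviours in a small sketch: getfqdn() lets the local resolver qualify a bare host name using its search domains, while a trailing dot marks a name as already absolute. The host names below are placeholders, so the second lookup is expected to fail on most systems.

    import socket

    # Ask the resolver to qualify a bare host name using the system's search domains.
    # The result depends entirely on local DNS configuration.
    print(socket.getfqdn("myhost"))

    # A trailing dot marks the name as absolute, so no default domain is appended.
    try:
        print(socket.gethostbyname("somehost.example.com."))
    except socket.gaierror as err:
        print("lookup failed:", err)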

Domain name

A domain name is an identification label that defines a realm of administrative autonomy, authority, or control on the Internet, based on the Domain Name System (DNS).

Domain names are used in various networking contexts and for application-specific naming and addressing purposes. Prominent examples are the top-level Internet domains com, net and org.

Below these top-level domains (TLDs) in the DNS hierarchy are the second-level and third-level domain names that are open for reservation and registration by end-users that wish to connect local area networks to the Internet, run web sites, or create other publicly accessible Internet resources. The registration of these domain names is usually administered by domain name registrars who sell their services to the public.

Individual Internet host computers use domain names as host identifiers, or hostnames. Hostnames are the leaf labels in the domain name system usually without further subordinate domain name space. Hostnames appear as a component in Uniform Resource Locators (URLs) for Internet resources such as web sites (e.g., en.wikipedia.org).

Domain names are also used as simple identification labels to indicate ownership or control of a resource. Such examples are the realm identifiers used in the Session Initiation Protocol (SIP), the DomainKeys used to verify DNS domains in e-mail systems, and in many other Uniform Resource Identifiers (URIs).

An important purpose of domain names is to provide easily recognizable and memorizable names to numerically addressed Internet resources. This abstraction allows any resource (e.g., website) to be moved to a different physical location in the address topology of the network, globally or locally in an intranet. Such a move usually requires changing the IP address of a resource and the corresponding translation of this IP address to and from its domain name.

This article primarily discusses the registered domain names, the domain names registered by domain name registrars to the public. The Domain Name System article discusses the technical facilities and infrastructure of the domain name space and the hostname article deals with specific information about the use of domain names as identifiers of network hosts.

Thursday, May 21, 2009

Antivirus Software

Antivirus software is mainly used to prevent and remove computer viruses, including worms and trojan horses. Such programs may also detect and remove adware, spyware, and other forms of malware.

A variety of strategies are typically employed. Signature-based detection involves searching for known malicious patterns in executable code. However, signatures can only be updated after viruses are created; users can be infected in the time it takes to create and distribute a signature. To counter such zero-day viruses, heuristics may be used to essentially guess whether a file is malicious. Generic signatures look for known malicious code and use wildcards to identify variants of a single virus. An antivirus may also emulate a program in a sandbox, monitoring for malicious behavior. Success depends on striking a balance between false positives and false negatives. False positives can be as destructive as false negatives: in one case a faulty virus signature issued by Symantec mistakenly removed essential operating system files, leaving thousands of PCs unable to boot.
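As a toy illustration of signature matching (not how any particular commercial scanner is implemented), the Python sketch below searches files for byte patterns. The first signature is the harmless EICAR test string that scanners detect by convention; the second is a made-up pattern whose regular-expression wildcard stands in for the variable bytes of a hypothetical variant.

    import re
    import sys
    from pathlib import Path

    EICAR = rb"X5O!P%@AP[4\PZX54(P^)7CC)7}$EICAR-STANDARD-ANTIVIRUS-TEST-FILE!$H+H*"
    SIGNATURES = {
        "EICAR-Test-File": re.compile(re.escape(EICAR)),
        "Example-Variant": re.compile(rb"BADC0DE.{0,16}PAYLOAD"),  # '.' is the wildcard
    }

    def scan(path):
        """Return the names of any signatures found in the file."""
        data = Path(path).read_bytes()
        return [name for name, sig in SIGNATURES.items() if sig.search(data)]

    if __name__ == "__main__":
        for path in sys.argv[1:]:
            print(path, scan(path))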

Antivirus software can have drawbacks. If it is of the type that scans continuously, it may cause a significant decline in computer performance, and it may present users with decisions they do not understand. Antivirus software generally works at the highly trusted kernel level of the operating system, creating a potential avenue of attack. The effectiveness of antivirus software is also a contentious issue; one study found that the detection success of major antivirus software dropped over a one-year period.

Monday, May 18, 2009

Server Monitoring Software

Scalable, Efficient Monitoring
Large sites monitor more than 1000 servers/devices from a single PA Server Monitor server.

Flexible Monitoring
Monitor dependencies, maintenance schedules, time of day rules

Powerful Alerts
Per-monitor event escalation rules, alert suppression, SMS, email and more

Ease of Use
Simple Startup Wizard - be monitoring 5 minutes from now. Bulk Config to quickly make huge changes.

Reports
Automatic server and server-group reports, ad-hoc reports, detailed scheduled reports

You can easily monitor...

Event logs, CPU usage, Memory usage, NIC usage, Free disk space, Running services
Log files, Server & room temperature, SNMP object values, Running process(es)
Directory quotas, Changed files and directories, Performance counter values
POP, IMAP and SMTP mail servers, Web page content and load times
Ping response times, TCP port response, Citrix Monitoring, Additional resources via user scripts
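By way of illustration only (this is not PA Server Monitor's own API), a couple of these checks can be sketched in a few lines of Python; the URL below is a placeholder.

    import shutil
    import time
    import urllib.request

    def free_disk_gb(path="/"):
        """Free space on the volume holding 'path', in gigabytes."""
        return shutil.disk_usage(path).free / 1024**3

    def page_load_seconds(url):
        """Time taken to fetch a page, a crude stand-in for a load-time monitor."""
        start = time.monotonic()
        with urllib.request.urlopen(url, timeout=10) as resp:
            resp.read()
        return time.monotonic() - start

    if __name__ == "__main__":
        print("free disk (GB):", round(free_disk_gb(), 1))
        print("page load (s):", round(page_load_seconds("http://www.example.com/"), 2))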

PA Server Monitor is the most powerful monitoring solution in its class.

Friday, April 24, 2009

Understanding Viruses

Anyone with even a small amount of computer experience has heard of computer viruses. Even with no knowledge of how a virus functions, the word strikes fear into the heart of computer owners. What exactly is a virus and how does it function?
What is a Virus?
A virus is a computer program designed to enter your computer and tamper with your files without your knowledge. Once the program containing the virus is open, the activated virus can not only infect other programs and documents on your computer, it can duplicate and transmit itself to other computers that are connected to yours, just like a physical virus can move from one human host to another.

Viruses began in the late 1980s as personal computers and electronic bulletin boards became more common. Back then, operating systems, word processing programs and other programs were stored on floppy disks. Hidden viruses were programmed onto these disks; as the disks were transferred from person to person, the virus spread.
Who Creates Viruses?
Where do viruses come from? Every virus is created by an author with a different motive—but all virus builders feel their actions are justified. For some, a killer virus is the ultimate technical challenge, like climbing a mountain. For others, creating viruses is a form of self-expression. Some disgruntled employees, consumers or citizens turn to virus building as revenge for perceived injustices. And though it’s a frightening thought, some viruses are built and aimed by legitimate (but disreputable) businesses to weaken competitors. Other virus authors want to make their mark in Internet history; these writers get a thrill out of seeing their virus cause damage enough to attract news headlines both online and on the evening news.
What Do Viruses Do?
Today’s viruses are far more potent than the beginner versions we saw decades ago. Viruses may be contracted by opening email attachments, clicking on spam, visiting corrupt websites and links online, opening spreadsheets, or even via the original method: infected disks. But the Internet is now the superhighway for virus transmission.
Some aggressive viruses, such as the Melissa virus, automatically send copies of themselves to the first 50 people in your computer's email address book. It is a frightening prospect: opening an email from someone you trust, only to be greeted by a virus. And that is exactly what the author is counting on, your trust.

The damage caused by these viruses varies from minor delays in computer function to complete destruction of your hard drive. For companies, the price is far higher. A downed website can cost a company millions of dollars a day.
How does the virus infect your computer? Because floppy use is nearly extinct and the majority of CDs that change hands cannot be altered, you will most likely bump into a virus through online activity.

Some viruses attack your boot sector, the start-up section of your hard drive. Other viruses infect executable program files, activating each time the program is started; the virus travels into memory and copies itself further.
Macro viruses are the most common type of computer virus. This type of virus attacks data files containing macros, which are lists of recorded commands or actions. The virus resembles a macro, but when the file is opened the virus is activated.
Multipartite viruses are a combination of the boot sector and file virus. They begin in the boot sector and affect both your boot records and your program files.
Is My Computer Infected?
How can you tell if your computer has a virus? There are warning signs that your computer may be infected. With minor viruses, you may encounter strange messages, images, noises or music on your computer. An infected computer may have less memory available, or you may notice that file names have changed. A computer infected with a virus may be missing programs, or files may malfunction. If you encounter any of these symptoms on your computer, you are most likely experiencing a virus attack.

Is there any hope? How can you protect your computer from viruses? If you do not have any virus software on your computer now, consider installing some soon. Be sure to update your anti-virus software regularly; this way you’ll be protected from new viruses that crop up.
Use your software to scan for viruses weekly. Don’t open emails from unknown sources, and be cautious when opening attachments - even attachments from people you trust. Hyper-vigilance would require you to contact the sender and confirm every attachment before you open it, which is too much; just be aware. If Aunt Gertrude typically includes a newsy, well-written letter with the jokes she sends, and an attachment arrives from her saying only “Open this now, baby!”, alarm bells should go off. Don’t open it.

Regularly back up your data in case a virus attacks your hard drive and you need to reformat. Better yet, set up your computer to back up your data automatically every week so that you don’t have to worry about this chore.
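A minimal sketch of such an automated backup using Python's standard library is shown below; the source and destination paths are placeholders, and the script would be scheduled weekly with cron or Windows Task Scheduler.

    import datetime
    import shutil

    # Placeholder paths; point these at your documents folder and backup drive.
    SOURCE = "/home/user/Documents"
    DEST = "/mnt/backup/documents-" + datetime.date.today().isoformat()

    shutil.make_archive(DEST, "zip", SOURCE)   # writes e.g. documents-2009-04-24.zip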
What Should I do if I have a Virus?
What do I do if I find out that I have a virus on my computer? Know that it’s not the end of the world. As a courtesy, contact everyone you have exchanged email with (by phone, preferably) to warn them of possible exposure to the virus right away.
Clean your computer with anti-virus software. If your computer is still not functioning and you have data you are concerned about recovering, consider hiring a trusted expert. Often data can be successfully extracted from an injured hard drive, but the process is complex and will involve another computer, special software, and a technician with a lot of experience in data recovery.
As a last resort, reformat your hard drive, even if it means destroying all of the information located there. Reinstall the software and data using your backup files.

Thursday, April 23, 2009

Why Choose Cisco

The Value of a Systems Approach

A systems approach begins with a single, resilient platform such as the Cisco integrated services routers. A systems approach combines packaging with intelligent services within and between services, and weaves voice, security, routing, and application services together, so that processes become more automated and more intelligent. The results are pervasive security in the network and applications; higher QoS for data, voice, and video traffic; faster time to productivity; and better use of network resources.

With the integrated services router, Cisco offers a comprehensive, future-proofed solution that minimizes network outages and ensures access to the most business-critical applications. Cisco's focus on integrating new infrastructure services with performance enables companies to create networks that are more intelligent, resilient, and reliable. For organizations of all sizes that need fast, secure access to today's mission-critical applications as well as a foundation for future growth, Cisco routers:

  • Provide the industry’s first portfolio engineered for secure, wire-speed delivery of concurrent data, voice, and video services
  • Embed security and voice services into a single routing system
  • Use an integrated systems approach to embedded services that speeds application deployment and reduces operating costs and complexity
  • Provide unparalleled services performance and investment protection

Unlike specialized niche products, Cisco Integrated Services Routers embed security and voice services as a single resilient system for ease of deployment, simplified management, and lower operating costs. Cisco routers provide the secure communications solutions you need today, while laying the foundation for tomorrow's Intelligent Information Networks.

In addition, Cisco Integrated Services Routers:
  • Provide fast, secure access to mission-critical business applications and unmatched investment protection for future growth, enabling organizations to easily deploy and manage converged communications solutions with end-to-end security for maximum end user productivity
  • Feature industry-leading services densities, bandwidth, availability, and performance options for maximum configuration flexibility and scalability for the most demanding networking environments
  • Provide a broad range of voice densities and services, allowing customers to easily enable end-to-end, best-in-class IP Communications solutions, while providing a foundation for future growth and investment protection
  • Are the only routers that allow organizations to build a foundation for an intelligent, self-defending network, featuring best-in-class security services and routing technologies for the lowest total cost of ownership and highest return on investment.

Monday, April 13, 2009

VOIP Solutions

VoIP has evolved greatly in recent years and has advanced Internet-based telephony. It offers cost savings for organizations that require Internet telephony solutions. VoIP stands for Voice over IP and is used for making phone calls over the Internet; it converts voice signals into digital packets and supports two-way conversation using the Internet Protocol. Many businesses around the world have implemented VoIP solutions because of its features and low price.

This technology provides scalable, mobile, reliable and secure voice communication solutions to small, medium and large enterprises around the world. Calls over the Internet can be made using the services of a VoIP provider together with a computer's audio system and a headset. Calls can also be placed using a VoIP telephone, or a normal telephone with a VoIP adapter. Users can make calls very easily, and the technology offers great savings over conventional phone calls. Recent advantages of Voice over IP, such as low cost and increased reliability, have made it the leading voice solution over the Internet.

Businesses and call centers around the world require sufficient bandwidth, scalability and a secure infrastructure to implement VoIP solutions, and some broadband service providers offer less bandwidth than is required. Before getting VoIP services from a provider, check its features and make sure they suit your business requirements. Low bandwidth, network latency, packet loss and jitter can all leave users with unsatisfactory voice quality.
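To get a feel for the bandwidth numbers, the Python sketch below estimates the capacity needed for a given number of concurrent calls. It assumes the common G.711 codec at roughly 87 kbps per call in each direction (64 kbps of voice payload plus packet overhead at 20 ms packetisation); the 25% headroom figure is an illustrative assumption rather than a standard.

    # Rough per-call bandwidth for G.711 with RTP/UDP/IP/Ethernet overhead.
    PER_CALL_KBPS = 87

    def required_kbps(concurrent_calls, headroom=1.25):
        """Bandwidth to budget per direction, with headroom for signalling and bursts."""
        return concurrent_calls * PER_CALL_KBPS * headroom

    print(required_kbps(10))   # ~1088 kbps, a little over 1 Mbps in each direction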

VOIP Features

Voice over IP technology provides the following features for businesses:

Voice Mail
Call Waiting
3 Way calling
Caller ID with name
Enhanced voice mail
Anonymous call blocking
Speed dialing
Call transfer
Phone to phone option
Repeat dial
Call logs
Conference calling
Customized ringing tones
SMS delivery solutions
Call back facility
Secure calls using standardized protocols

Web Security Software

The web is evolving rapidly and has become more dynamic with the use of Web 2.0 applications, but it is also used to launch various attacks. There are many threats facing users who browse the Internet, check email, transfer files and shop online. An online computer can be infected with viruses or spyware in as little as 20 minutes if proper security solutions have not been implemented. Well-known threats include viruses, spyware, adware, phishing, Trojan horses, malware, web worms, intruders and cyber villains (hackers). If your computer is infected, all other connected computers can become infected as well.

There are many Internet bugs that can corrupt your data, degrade the performance of your system, steal your critical data and crash your system. Many Internet filtering, antivirus, anti-spyware, anti-phishing and other security solutions are available. By implementing web security solutions, you can protect your entire network from attack by viruses, hackers and other online threats.

Common techniques for dealing with online threats are to install up-to-date antivirus software, install and configure a firewall, update your operating system, raise the security settings of your browser, never open email attachments from unknown sources, and never give away your passwords, credit card numbers, SSN or bank account details by email. Some web security solutions include the following.

Norton Antivirus


Norton Antivirus provides a complete solution against viruses, spyware and adware and helps protect your computer from online threats. It works in the background so that you can surf the net, play games and download software or music. Norton Antivirus automatically checks for updates and blocks emerging threats. A wide variety of security products is available, so you can pick the product that best fits your requirements.

McAfee Antivirus Software

McAfee is a leader in security risk management and intrusion detection and prevention. McAfee Internet Security Suite provides protection against viruses, spyware, adware, Trojan horses and hackers, and helps detect thousands of viruses on your computer.

GFI Web Monitor

GFI Web Monitor software protects your computer from downloading dangerous items from the Internet, reduces cyberslacking, prevents data leakage and lowers the risk posed by social engineering and by phishing websites and emails.

Zone Alarm Internet Security Suite

ZoneAlarm Internet Security Suite 7 provides anti-spam, firewall and antivirus capabilities and protects your computer from well-known Internet threats.

Networking Tips

Computer networks are used to share data and resources and for communication. To achieve optimal performance, data protection, maintainability, reliability and security, every system administrator and network administrator should know basic maintenance, troubleshooting and security techniques. Downtime is very costly for critical business applications and servers. In this article, you will learn some of the best networking tips; by using them you can get optimal performance from your network.

Security
A computer network is susceptible to internal and external security threats, including viruses, spyware, adware, Trojan horses, rootkits, web worms, intruders and hackers. To keep your network secure:

Firewall:
Install and configure a software/hardware firewall on your gateway and on all other computers in your network. A firewall monitors inbound and outbound traffic and blocks unauthorized access and hacker attacks.
Antivirus: Install antivirus software such as Norton Antivirus, Trend Micro Office Scan, Panda Antivirus or McAfee and regularly scan your computer with an antivirus program.
Anti-spyware: Install and configure up-to-date anti-spyware software in your network.
Updated Operating System: Update your Windows based operating systems with the latest service packs, hot fixes and security patches.
Browser Security: Raise the level of security of your web browsers.

Connectivity

Computer networking is sometimes considered complex and hard to troubleshoot. Connectivity problems occur in a computer network due to device conflicts, outdated LAN card drivers, faulty hardware, faulty cables or connectors, and misconfiguration. To troubleshoot connectivity-related issues, perform the following tasks.

Check the LEDs of your LAN card.
Update the driver of your LAN card.
Scan your computer for the viruses and spyware.
Check the UTP/STP cable; both ends of the cable should be properly inserted, i.e. one end in the LAN card and one end in the hub/switch or router.
Check the configurations of the LAN card.
PING the destination computer and check the status.
If your problem is still not resolved, replace the LAN card and reconfigure it.
Maintenance
Computer network availability and security are critical for businesses. Maintenance includes domain setup, dealing with internal and external security threats, assigning IP addresses to computers, enabling or disabling network services such as DHCP, FTP, SMTP and SNMP, taking data backups, adding and removing users, troubleshooting software and hardware, configuring the firewall, and implementing security across the overall IT infrastructure. To perform these maintenance tasks in your computer network, you need the right tools.

Troubleshooting
You can troubleshoot computer network problems by using the right tools and techniques. By default, Windows-based operating systems include the TCP/IP stack, which comes with troubleshooting and diagnostic utilities such as PING, IPCONFIG, Hostname, ARP, Telnet, NSLOOKUP, Tracert, and many others. Pinging a network computer is the first troubleshooting step, as it checks connectivity with the destination computer. Additionally, you can use other troubleshooting tools such as Ethereal, IP Sniffer, LanGuard, Packeteer, and many others. These tools help you diagnose the cause of a problem and troubleshoot it.
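
As a rough illustration of the Telnet-style check mentioned above, the short Python 3 sketch below simply tries to open a TCP connection to a given host and port; the host name and port number are placeholder values.

import socket

def tcp_port_open(host, port, timeout=3.0):
    # Telnet-style test: can we open a TCP connection to host:port?
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Placeholder example: check whether a mail server is listening on port 25.
print(tcp_port_open("mailserver", 25))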

Performance
To get the optimized performance from your computer network, you need to perform the following actions on every computer of your network.

Use System Tools
Delete Unnecessary Files
Update Device Drivers
Update BIOS
Uninstall Unused Programs
Update Operating System
Wireless Networking Security Tips
The following tips are very helpful in securing your wireless computer network.

Change the default SSID.
Change the default administrator password.
Disable SSID broadcast.
Enable MAC address filtering.
Assign static IP addresses to network devices and computers.
Turn on and configure the firewall on every computer in your network.
Enable IPSec, SSL, encryption, WPA and WEP according to your security requirements.

Saturday, April 4, 2009

McAfee Vs. Norton


Protecting your PC is a necessity for anyone connecting to the internet. And these two releases from Norton and McAfee provide essential one-stop solutions. Each offers the core requirements and neither will be a poor choice, but your own requirements could make either one slightly more desirable.

A big complaint about previous versions of Norton was that it could be overwhelming. The updated Protection Center offers a simplified interface, with main features accessible from the front page and configuration options tucked away. McAfee’s control panel is not quite as slick, but experienced users will find it easier to locate settings they wish to modify, with essentials such as scanning and updates accessible via convenient buttons. And log files are more advanced, too.

Virus extinguishing
The core features of any internet security package remain antivirus protection and a firewall. Both products' firewalls protect users against harmful traffic. The enhanced Smart Firewall in Norton deals with a common problem for inexperienced users: how to deal with information about applications. Often files have obscure names, some of which may be necessary for the operation of essential applications.

For those who wish to have more control over their firewall settings, there are plenty of options to set up active, trusted and restricted networks. And an important feature of both applications is that they will monitor wireless networks. You can customise a list of trusted programs that are allowed to access the net.

In our advance review copy of McAfee, the firewall had to be downloaded after the suite was installed. This makes installation more complex than necessary, but it should be rectified by the final release. As with the Norton firewall, however, it does a very good job of hiding your PC from malicious attacks, and it offers customisable control over which applications can access the web.

There tend to be more alerts from McAfee, while Norton’s Smart Firewall makes more decisions on behalf of the user. That said, McAfee's ability to police networks is simpler.

With regard to antivirus protection, maintaining an up-to-date database of potential threats is essential. McAfee and Symantec have long been major players in this field, and both maintain extensive daily updates. The latest version of Norton includes an enhanced Auto-Protect component for viruses, spyware and adware, and offers IM (instant messaging) scanning and email protection too.

Enhancements have been made in areas such as rootkit detection and scan times. But despite our running a full system scan several times, Norton continued to insist that one be run every time it was launched.

Norton's Bloodhound heuristics analyse executable files to find potential virus threats, even if these are not matched against any database. While the antivirus features are particularly good at locating known spyware and viruses, we didn't have the opportunity to test the effectiveness of Bloodhound itself.

McAfee VirusScan made effective work of scanning for viruses, spyware and other malicious activity on our test PC, although this was considerably slower than Norton and placed a greater burden on system resources. Like Norton, McAfee provides a heuristic engine (called SystemGuards) to monitor suspicious activity and prevent viruses that are not listed. And it uses a system called X-Ray to find and remove rootkits.

Baiting the line
While malware is the most significant component for online protection, there are plenty of other menaces – some of which can be even worse. If your privacy is infringed, or you fall prey to a phishing scam, it can be much more than just your hard drive that suffers.

As such, protecting personal information is a key feature for both of these security programs.

Norton has several features to protect users from fraudulent sites. The most obvious is the Toolbar, which is displayed by default in your browser. Usually this appears as a large green button at the top of the browser, indicating that fraud monitoring is on. But if you encounter a web page masquerading as another, Norton prevents the page from being displayed. This does involve a slight performance hit, but is extremely useful against links in scam emails.

Norton can use advanced heuristics to check that sites are what they claim to be. By breaking down URLs and analysing the format and content of web pages, it can have a decent stab at telling whether a website is official or not.

McAfee’s protection features are, as far as the browser is concerned, less ostentatious. A SiteAdvisor button sits in the toolbar, informing you whether sites are safe or not. When confidential information is sent out, the privacy service can block the information and alert the user. This can easily double up as a parental control, an area where McAfee is particularly strong. The program can analyse web pages for inappropriate content and images and then block any offending material.

The improvements to Norton Internet Security have been made primarily in terms of firewall and antivirus protection. In this version, you won't find advanced parental controls, for example. Nonetheless, for the security-obsessed, there are very good logging features and a comprehensive virus encyclopedia.

By contrast, McAfee's suite provides several extras. As well as the parental controls already alluded to, Spamkiller will block unwanted email, while Data Backup allows you to protect your data in case files are lost or damaged. After the initial run, McAfee can back up files the second that they're updated. As well as ensuring maximum security of files, this cuts down on those annoying moments when the PC embarks on a 20-minute backup job, just as you're in the middle of an important assignment.

And if you fancy getting hold of even more features, there is a Total Protection suite that includes password protection for crucial data and file shredding. You will, however, have to pay more for it.

Cisco Router Technology Overview

Network Router: An Integrated Services Approach

The network router is quickly evolving from a device dedicated to connecting disparate networks to an integrated services device capable of multiple functions beyond routing. Cisco customers are increasingly deploying integrated services routers, or sophisticated network routers that can deliver voice, video, data and Internet access, wireless, and other applications.

Benefits of the Integrated Services Network Router
Growing companies, especially those opening new offices, can take advantage of integrated network router solutions that are highly secure, flexible, and built to be compatible with future technologies.

One Device, Multiple Functions: An integrated services network router enables organizations to take advantage of numerous built-in technologies such as voice, wireless, and advanced security systems while ensuring the quality of service (QoS) prioritization their network applications demand. Because the network services are built in or can be easily added to the integrated services network router, companies can install one sophisticated device rather than purchase separate products to provide each individual function.

Same Access at Headquarters and Remote Sites: An integrated services network router gives all workers—even those at branch offices or remote sites such as a home or hotel room—the same access to business applications, unified communications, and videoconferencing. Modular solutions allow you to install the features you need for a particular office, and upgrade equipment when needs change or an office expands.



Centralized Management: An integrated services network router approach means technical staff at headquarters can manage the network from a central location. This allows technical departments to allocate resources to priority projects while providing reliable service to employees in all locations.

Integrated Network Security: An integrated services network router, with its systems approach, allows companies to transfer responsibility for security and reliability from individual computers and users to the network itself. This helps protect companies from the influx of viruses, malicious code, and other infections that end users’ laptops might unknowingly acquire.
By installing a complete solution and managing it centrally, companies can protect valuable corporate data using multiple types of protection, such as encryption, firewall filtering, antivirus protection, and intrusion detection and prevention.

Reap the Rewards of a Network Router
From reduced capital and operating expenses to increased productivity, the advantages of integrated services apply regardless of the size of company. A small company with two offices can benefit as much as a company with hundreds of branch offices.

What's the difference between a Hub, a Switch and a Router?

In a word: intelligence.

Hubs, switches, and routers are all devices that let you connect one or more computers to other computers, networked devices, or to other networks. Each has two or more connectors called ports into which you plug in the cables to make the connection. Varying degrees of magic happen inside the device, and therein lies the difference. I often see the terms misused so let's clarify what each one really means.

A hub is typically the least expensive, least intelligent, and least complicated of the three. Its job is very simple: anything that comes in one port is sent out to the others. That's it. Every computer connected to the hub "sees" everything that every other computer on the hub sees. The hub itself is blissfully ignorant of the data being transmitted. For years, simple hubs have been quick and easy ways to connect computers in small networks.

A switch does essentially what a hub does but more efficiently. By paying attention to the traffic that comes across it, it can "learn" where particular addresses are. For example, if it sees traffic from machine A coming in on port 2, it now knows that machine A is connected to that port and that traffic to machine A needs to only be sent to that port and not any of the others. The net result of using a switch over a hub is that most of the network traffic only goes where it needs to rather than to every port. On busy networks this can make the network significantly faster.
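
To make the idea of "learning" concrete, here is a small, purely illustrative Python model of a switch's MAC address table. The class and values are invented for this sketch and do not correspond to any real switch software.

class LearningSwitch:
    def __init__(self, num_ports):
        self.num_ports = num_ports
        self.mac_table = {}  # MAC address -> port it was last seen on

    def handle_frame(self, src_mac, dst_mac, in_port):
        # Learn: remember which port the source address arrived on.
        self.mac_table[src_mac] = in_port
        # Forward: send only to the known port; otherwise flood like a hub
        # to every port except the one the frame came in on.
        if dst_mac in self.mac_table:
            return [self.mac_table[dst_mac]]
        return [p for p in range(self.num_ports) if p != in_port]

sw = LearningSwitch(4)
print(sw.handle_frame("AA", "BB", in_port=2))  # BB unknown -> flood [0, 1, 3]
print(sw.handle_frame("BB", "AA", in_port=3))  # AA learned -> [2]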

"Varying degrees of magic
happen inside the device,
and therein lies the difference."


A router is the smartest and most complicated of the bunch. Routers come in all shapes and sizes, from the small four-port broadband routers that are very popular right now to the large industrial-strength devices that drive the internet itself. A simple way to think of a router is as a computer that can be programmed to understand, possibly manipulate, and route the data it's being asked to handle. For example, broadband routers include the ability to "hide" computers behind a type of firewall, which involves slightly modifying the packets of network traffic as they traverse the device. All routers include some kind of user interface for configuring how the router will treat traffic. The really large routers include the equivalent of a full-blown programming language to describe how they should operate, as well as the ability to communicate with other routers to describe or determine the best way to get network traffic from point A to point B.
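
The routing decision itself can be sketched with the standard Python ipaddress module: a toy routing table is searched for the longest matching prefix, which is how real routers pick where to send a packet. The prefixes and next-hop labels below are made up for the example.

import ipaddress

# Toy routing table: destination prefix -> where to send the traffic.
ROUTES = {
    ipaddress.ip_network("192.168.1.0/24"): "local LAN port",
    ipaddress.ip_network("10.0.0.0/8"): "VPN tunnel",
    ipaddress.ip_network("0.0.0.0/0"): "ISP uplink (default route)",
}

def next_hop(destination):
    # Choose the matching route with the longest prefix (most specific).
    addr = ipaddress.ip_address(destination)
    matches = [net for net in ROUTES if addr in net]
    best = max(matches, key=lambda net: net.prefixlen)
    return ROUTES[best]

print(next_hop("192.168.1.42"))  # -> local LAN port
print(next_hop("8.8.8.8"))       # -> ISP uplink (default route)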

A quick note on one other thing that you'll often see mentioned with these devices: network speed. Most devices now are capable of both 10 Mbps (10 megabits, or million bits, per second) and 100 Mbps, and will automatically detect the speed. If a device is labelled with only one speed, it will only be able to communicate with devices that also support that speed. 1000 Mbps or "gigabit" devices are slowly becoming more common as well. Similarly, many devices now also include 802.11b or 802.11g wireless transmitters that simply act like additional ports on the device.
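
To get a feel for what these speeds mean in practice, here is a back-of-the-envelope Python calculation of how long a file transfer takes at each of the link speeds mentioned above. It ignores protocol overhead, so real-world transfers will be slower.

def transfer_seconds(file_megabytes, link_megabits_per_second):
    # Treat 1 MB as 8,000,000 bits for a rough estimate.
    bits = file_megabytes * 8_000_000
    return bits / (link_megabits_per_second * 1_000_000)

for speed in (10, 100, 1000):
    print(f"700 MB file at {speed} Mbps: about {transfer_seconds(700, speed):.0f} seconds")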

Fiber Optic Cable

20.1 Multimode (MM) Fiber
Step index or graded index fiber. In North America the most common
size is 62.5/125; in Europe, 50/125 is often used. These numbers
represent the diameter of the core (62.5) and diameter of the
cladding (125) in microns. Multimode fiber is typically used in
applications such as local area networks, at distances less than 2 km.

20.2 Single Mode (SM) Fiber
Single mode fiber has a very small core. Typical values are
5-10 microns. Single mode fiber has a much higher capacity and
allows longer distances than multimode fiber. Typically used
for wide area networks such as telephone company switch to switch
connections and cable TV (CATV).

20.3 Loose Buffer
The fiber is contained in a plastic tube for protection.
To give better waterproofing protection to the fiber, the space
between the tubes is sometimes gel-filled. Typical applications
are outside installations. One drawback of loose buffer construction
is a larger bending radius. Gel-filled cable requires the installer
to spend time cleaning and drying the individual cables, and
cleaning up the site afterwards.

20.4 Tight Buffer
Buffer layers of plastic and yarn material are applied over the fiber.
Results in a smaller cable diameter with a smaller bending radius.
Typical applications are patch cords and local area network connections.
At least one mfr. produces this type of cable for inside/outside use.

20.5 Ribbon Cable
Typically 12 coated fibers are bonded together to form a
ribbon. There are higher density ribbons (x100) which have
the advantage of being mass-terminated into array connectors.
A disadvantage is that they are often harder, and require special
tools to terminate and splice.

20.6 Fiber Connectors
There are a lot of different types of connectors, but the ones
commonly found in LAN/MAN/WAN installations are:

FSD - Fixed Shroud Device, such as the FDDI MIC dual-fiber connector.
SC - A push-pull connector. The international standard.
The SC connectors are recommended in SP-2840A. The SC
connector has the advantage (over ST) of being duplexed
into a single connector clip with both transmit/receive fibers.
SMA - Threaded connector, not much used anymore because of losses
that change with each disconnection and reconnection.
ST - Keyed, bayonet-style connector, very commonly used.


20.7 Fiber Optic Test Equipment
Continuity tester: used to identify a fiber, and detect a break.
One type resembles a f/o connector attached to a flashlight.

Fault locator: used to determine exact location of a break.
Works by shining a very bright visible light into the strand.
At the break, this light is visible through the cable jacket.

Tone Generator and Tracer: used to identify a cable midspan or
to locate a strand at its far end. Similar in purpose to the
tone testers used on copper cable. The tone generator imposes
a steady or warbling audio tone on light passing down the cable.
The tracer detects and recovers the tone from light lost through
the cable jacket as a result of bending the cable slightly.

Optical Source and Power Meter: used to measure the end-to-end
loss through a f/o strand, or a system of cable, connectors and
patch cables. Measurements are more accurate than an OTDR (see
the loss calculation sketch at the end of this section).

Optical Time Domain Reflectometer (OTDR): used to measure the length
of a cable, and detect any flaws in it. Can also be used to measure
end-to-end loss, although less accurately than a power meter.

Fiber Talk set: allows using a pair of f/o strands as a telephone line.
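
As a rough sketch of the arithmetic behind the optical source and power meter measurement, end-to-end loss in decibels can be computed from the launched (reference) power and the power measured at the far end. The Python lines below are illustrative only; the milliwatt values are made-up examples, not measurements from any real link.

import math

def loss_db(reference_mw, measured_mw):
    # End-to-end loss in dB, given the launched (reference) power and the
    # power measured at the far end of the fiber, both in milliwatts.
    return 10 * math.log10(reference_mw / measured_mw)

# Example: 1.0 mW launched, 0.25 mW received -> about 6 dB of loss.
print(round(loss_db(1.0, 0.25), 2))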

How To Add / Set Up A Microsoft Access Database

The Database Manager section of the Hosting Control Panel provides you with the ability to create and manage your own data source names for your databases.


To set up a Microsoft Access Database:


1. Access the Hosting Control Panel

2. Click on the Database Manager button then click on MS Access Database

3. Click on the Add button in the upper right hand corner

4. Type a Data Source Name (DSN) in the Data Source Name text box – this is the data source name your web application code will use to reference the database; a connection example follows these steps

5. Type a database name in the Database Filename text box – the name of the Access database the DSN will be attached to; enter the entire filename, including the .mdb file extension

6. Type a username in the Username text box [optional]

7. Type a password in the Password text box and type it again in the Password Confirm text box [optional]

8. Click on the Ok button

To view the MS Access database information, select the radio button next to the Database Name, then click on the View button.

Note: If you password-protected your MS Access database, you must use the same password during the add process.
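
Once the DSN exists, server-side code can open the database through ODBC. The Python sketch below uses the third-party pyodbc package purely as an illustration (your hosting plan may instead expect Classic ASP or another language, and the script must run on the machine where the DSN is defined). The DSN name, credentials and table name are placeholders.

import pyodbc  # third-party ODBC library; install separately

# Placeholder DSN, username and password from the steps above.
conn = pyodbc.connect("DSN=MyAccessDSN;UID=webuser;PWD=secret")
cursor = conn.cursor()

# "guestbook" is a hypothetical table in the Access database.
cursor.execute("SELECT TOP 5 * FROM guestbook")
for row in cursor.fetchall():
    print(row)

conn.close()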

Deploying Microsoft Network Access Protection (NAP) with Aruba's Mobile Network Solutions

Introduction

With an increasing trend toward mobility, more and more companies outfit their employees with wireless mobile devices that leave the corporate network and attach to networks at homes, public wireless hotspots, hotels, and partner sites. When these devices return to the corporate network, any malicious software they may be carrying can spread to other corporate systems. For this reason, ensuring that devices are properly protected from malicious software has become a key interest of IT departments.

Aruba Networks' user-centric architecture has comprehensive access control capabilities and is built on a standards-based architecture that can easily integrate third-party security vendors for functions such as endpoint compliance. Aruba has partnered with Microsoft® to support Network Access Protection (NAP) for mobile users. Network Access Protection for Windows Vista™ and Windows Server® "Longhorn" (now in beta) is a technology designed to prevent networked assets from connecting to, or communicating with, non-compliant clients. It enforces compliance with network access and health requirement policies by setting access rights based on validated health state and by coordinating endpoint remediation services to ensure ongoing compliance.
NAP for Wireless LANs

This article introduces the NAP solution within the scope of the 802.1x and 802.11i wireless security mechanisms. The full deployment document is attached.
A Simple NAP Architecture

[Diagram: Aruba and Microsoft Network Access Protection architecture]
Wireless Settings

A general recommendation is to implement the highest level of encryption available, which, in the case of an 802.11 network, is 802.11i, followed by 802.1x. The SSID used to connect users to the corporate network should support WPA2-AES / WPA-TKIP, or dynamic WEP with 802.11i / 802.1x Wi-Fi authentication methods.

NAP Operations

The basic Microsoft NAP solution is illustrated by the diagram above.

The managed Microsoft client tries to connect to the network and is required to authenticate.
The client provides its login credentials to the server, and during the login process the client's NAP agent (system health agent), if enabled, presents the client's current health status (antivirus signatures, patch levels, firewall settings, applications, etc.).
The Aruba mobility controller forwards the authentication credentials and health state information via the RADIUS protocol to the Network Policy Server (a Microsoft RADIUS server). The NPS evaluates the client's health status against a pre-defined set of policies.



Microsoft NPS validates the client's credentials once they are received. If the client's credentials do not match the entries in Active Directory, authentication fails, a failed-authentication message is passed to the Aruba controller, and the controller denies the client network access.
If authentication succeeds but the client is not compliant with the predefined health requirement policy, Microsoft NPS sends limited network access configuration information to the Aruba mobility controller, which places the client in a "role" with restrictive firewall policies. The client has limited access to the network and to other clients, and is redirected to a remediation server to get updates. The client requests and receives the updates and then starts over by re-authenticating.
If the client is compliant with the health requirement policy, it is granted access to the network according to its business needs; e.g. a sales user is granted access to sales servers while access to finance networks and servers is blocked.
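
The policy decision described in the last two steps can be sketched in a few lines of Python. The health-check fields and role names below are illustrative only; they are not Microsoft NPS or Aruba configuration syntax.

# Required patch level, as a plain value for illustration.
REQUIRED_PATCH_LEVEL = 3

def assign_role(credentials_valid, health_report):
    # Deny access outright if Active Directory authentication failed.
    if not credentials_valid:
        return "deny-access"
    # Otherwise compare the reported health state with the policy.
    compliant = (
        health_report.get("antivirus_signatures_current") is True
        and health_report.get("firewall_enabled") is True
        and health_report.get("os_patch_level", 0) >= REQUIRED_PATCH_LEVEL
    )
    return "employee-role" if compliant else "quarantine-role (remediation only)"

# A client with its firewall switched off lands in the quarantine role.
print(assign_role(True, {"antivirus_signatures_current": True,
                         "firewall_enabled": False,
                         "os_patch_level": 3}))
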
Advantages of Using Aruba

The Aruba solution allows the network manager to further enhance the usability, scalability and manageability of this solution. By using the Aruba system's ability to assign roles and policies to users based on their authentication state and the attributes returned, users can be dynamically classified into different user groups based on the authentication results.