
I have given this talk about slow machines a number of times and I have been asked to write down what people need to know. It has taken time to put this all together, and there are still some areas I would not recommend the "timid" user attempt, but here we go!


It’s not as fast as it used to be when I bought it!

One of the biggest computer annoyances is when your machine gets slow over time.
This comes in two parts: first, the reasons it's slow, and second, what you can do about it.


Part 1: Why is my computer system slow?

When you first get a new computer and boot it up, it works lightning fast. That's because it doesn't have anything on it, and it's the fastest it's ever going to be. Then it starts slowing down. This can begin years after you get a PC, but sometimes it happens in just a few short months.



Some computers are never going to stay fast; it's the way they are designed and usually reflects the price you paid. The cheaper the machine, the slower the internal components.



Now just a small technical bit: every computer has an FSB (front-side bus) speed. You won't see it in the headline specs, as only techies are interested, but it's the common denominator – the speed that everything works at. Forget processor speed; this is the important one. If you have a slow FSB then you have a machine that will quickly start to slow down.



Now the other reasons: you fill the computer with your data, you install software, the operating system gets updates and, on top of all that, it will slow down with age. Typically a five-year-old computer will be running around 20% slower due to the ageing of its components.



So there isn’t one single reason that pinpoints why this happens. Regardless of whether you have a PC or Mac, over time as you download files, install software, and surf the Internet, your computer gets bloated with files that hog system resources.



We have to face the fact that as time goes on, our computers will get slow. It's a natural progression. The Internet and software capabilities evolve by the minute. These new innovations require more power and space to keep up with the pace. Sometimes it might not even be your fault that your once zippy computer is now crawling – it's just a sign of the times.



In addition, there are many other things that contribute to a slowdown. These are:




HD
As your hard disk fills up it takes longer to store and retrieve data. This is partly down to the design and quality of the hard disk, but it's also because of the way data is stored. If you have a spinning hard drive, it will simply start to slow down as it gets older and approaches the end of its life. Lower-cost drives store and retrieve information more slowly.



It's important to note that all spinning hard drives will die eventually – it could be tomorrow, or it could be 10 years from now. It's just the nature of their design.



A hard disk doesn't store data in sequential order; it puts it where it can, and when you want it back the computer looks up where all the bits are stored and reassembles them. The index that holds this information is known as the FAT (File Allocation Table) on older drives; modern Windows drives formatted with NTFS keep an equivalent index called the Master File Table.
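
If you want to see how full a drive actually is, here is a minimal Python sketch (the drive letter is just an example – adjust it for your machine):

    import shutil

    # Quick check of how full a drive is.
    usage = shutil.disk_usage("C:\\")
    gb = 1024 ** 3
    print(f"Total: {usage.total / gb:.1f} GB")
    print(f"Used:  {usage.used / gb:.1f} GB ({usage.used / usage.total:.0%})")
    print(f"Free:  {usage.free / gb:.1f} GB")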




RAM
All programs are loaded into an area to be worked on called RAM (memory). RAM comes at varying speeds too, but the more programs you run, the more RAM you use. If there isn't enough, the computer starts moving things around and uses the hard disk as extra space (the page file); everything gets really slow when that starts happening.
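
As a rough illustration, here is a short Python sketch that shows how much RAM and page file the machine is using; it assumes the third-party psutil library, which isn't mentioned above:

    import psutil  # third-party: pip install psutil

    # How much RAM is in use, and is the machine dipping into the page file?
    ram = psutil.virtual_memory()
    swap = psutil.swap_memory()
    print(f"RAM used: {ram.percent}% of {ram.total / 1024**3:.1f} GB")
    print(f"Page file used: {swap.percent}% of {swap.total / 1024**3:.1f} GB")
    if swap.percent > 25:
        print("Heavy page-file use - the machine is probably short of RAM.")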




TSRs, leftovers, Prefetch, Resmon
Some programs load, run and then finish but still hog some RAM; these are known as Terminate and Stay Resident (TSR) programs. Windows also remembers what you have used and holds this in an area called Prefetch. If you want to see what's going on, just go to Start and type RESMON to open the Resource Monitor.
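
If you would rather get a quick list from a script than read through Resource Monitor, here is a hedged Python sketch (again assuming the third-party psutil library) that lists the ten processes using the most RAM:

    import psutil  # third-party: pip install psutil

    # A rough, text-mode version of RESMON's memory view.
    procs = []
    for p in psutil.process_iter(["name", "memory_info"]):
        mem = p.info.get("memory_info")
        if mem is not None:
            procs.append((mem.rss, p.info["name"] or "?"))

    for rss, name in sorted(procs, reverse=True)[:10]:
        print(f"{rss / 1024**2:8.1f} MB  {name}")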




Malware, spyware, viruses and botnets
These are programs that someone else wants to run on your computer. This would be an article on its own, but these are all the "nastiness" that slows computers down.


Anti-virus packages
Many computers come with an anti-virus package pre-installed, but people forget to remove the old one before the new one is installed. It's like leaving your old car in your garage and then trying to drive your new car into the same space – not ideal.




Plug ins
Everyone wants you to install their toolbar or add-in. Don't – and if you already have, remove them. All these plug-ins take up RAM, and some of them divert your Internet searches so that other people can see what you are looking for and may copy your credit card details when you buy something online.




BloatWare
This is often the sort of "free" software that promises to sort out your registry and clean up everything. If it's free, it's probably malware – and how do you speed things up by adding more? It's all unnecessary software that fills up your hard drive and RAM, costing you space and speed.




Cookies
Cookies are generally good but like everything you can have too many. A cookie is a small piece of data that a web site you are visiting downloads to your machine.




Data/OS corruption
Corruption is simple: it's when words come out extra, missing or jumbled. "the quoik briwn FFox umps ova the lzay dgo" – you know what I said, but it took longer to read! Software and hard drive corruption are two reasons why your computer may slow down over time.



Corruption can be caused by a host of things, but it's mostly bugs in the operating system, corrupted RAM data, static electricity (from carpet or other fabrics), power surges, failing hardware, and normal degradation with age.




Missing Windows updates & old drivers
Installing updates makes Windows bigger, but skipping them can leave security holes. If you add new devices or throw some away, you can also end up with old or unused drivers.




Overheating
If your computer sits on the floor it is sucking in cooling air along with dust and fluff; this builds up and stops the air circulating properly. Modern processors – Intel and AMD alike – deliberately slow themselves down (thermal throttling) as they warm up, so a clogged, overheating machine will run noticeably slower.





Part 2: What can I do?
You can always wipe it and start again, but assuming that's not practical, let's go through a step-by-step clean down. It will involve loading programs and going through lists.

First, back up your computer. Don't do anything unless you have. If you want to risk it, just stop and imagine that everything has vanished into thin air and you have to start from scratch – now go and make a backup.



Step 1: Check for malware – most anti-virus programs will try to protect you from getting a virus, but MalwareBytes is the most effective software for getting rid of an infection once you have one. For a belt-and-braces approach, I would recommend starting Windows in Safe Mode and then running MalwareBytes. To do this, switch on or restart your computer, then keep pressing F8 (this works on Windows 7 and earlier; on Windows 8 and later, hold Shift while you click Restart instead) – this will give you a list of options – choose Safe Mode with Networking. Then run MalwareBytes and restart your computer once it's finished.



Step 2: Download and run HijackThis. Once it has run, it will display a list in Notepad. Copy this using Ctrl-C, go to www.hijackthis.de, click on the empty area, press Ctrl-V, then click Analyze. Have a look down the list – if anything is marked with a red cross as "nasty", call us, as MalwareBytes has missed something.



Step 3: Run an online virus check such as Trend Micro HouseCall or ESET. Go to Google and type HouseCall in the search box, make sure it's a Trend Micro web site and then follow the instructions. For ESET, go to http://www.eset.co.uk/Antivirus-Utilities/Online-Scanner and follow the instructions.



Step 4: If you are running anything apart from ESET anti-virus then remove it and ask us about a trial ESET licence. If you have Norton or McAfee installed, get rid of them – they will slow your computer down. Other programs are large and bloated, and some simply don't work. Only have ONE anti-virus program installed. Having two or more anti-virus programs installed will dramatically slow your computer down because they compete with each other.



Step 5: Go to Start, Control Panel and find Programs (Programs and Features). Have a good look down the list; some programs you will know and some you will not. Uninstall those you know you don't need, including all the toolbars – especially Ask!, Google toolbars and the like.

If you have installed programs, they not only take up storage space but also increase the size of the registry. The registry is like an index which is scanned by the computer for program options. The bigger the index, the longer it takes to scan.
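
For the curious, here is a minimal Python sketch (Windows only, standard library) that lists what is registered under the usual Uninstall key – roughly what the Control Panel list shows. On 64-bit Windows some entries live under a separate WOW6432Node key, so treat this as an illustration rather than a complete inventory:

    import winreg  # Windows only, standard library

    # List the programs registered under the usual "Uninstall" key.
    UNINSTALL = r"Software\Microsoft\Windows\CurrentVersion\Uninstall"

    names = []
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, UNINSTALL) as root:
        subkey_count = winreg.QueryInfoKey(root)[0]
        for i in range(subkey_count):
            sub = winreg.EnumKey(root, i)
            try:
                with winreg.OpenKey(root, sub) as key:
                    names.append(winreg.QueryValueEx(key, "DisplayName")[0])
            except OSError:
                pass  # entries without a display name

    print(f"{len(names)} installed programs found")
    for name in sorted(names):
        print(" ", name)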



Step 6: Reboot your machine.



Step 7: Update Windows – ensure your Windows is as up to date as possible. This is mostly for security flaws that Microsoft has identified, but also bug fixes and so on. Windows has a "Windows Update" option but, in my experience, it's sometimes not up to date itself, so check here first: http://update.microsoft.com/ – there may also be optional updates, for example the latest versions of Internet Explorer or Windows Media Player. They are optional, but I would recommend installing them anyway.



Step 8: Delete temporary files, fix the registry and stop start-up programs. This might be a bit techy, though – if you don't know what a program does, have a Google for it. Start-up programs run when your computer starts and can take up valuable memory. Really, this is best done by a technician.
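
As a safe first look (it only reports – it deletes nothing), here is a small Python sketch that totals up temporary files older than 30 days in the user's TEMP folder:

    import os
    import time
    from pathlib import Path

    # Tot up temp files older than 30 days. Nothing is deleted.
    temp = Path(os.environ.get("TEMP", "/tmp"))
    cutoff = time.time() - 30 * 24 * 3600

    total = 0
    for f in temp.rglob("*"):
        try:
            if f.is_file() and f.stat().st_mtime < cutoff:
                total += f.stat().st_size
        except OSError:
            continue  # files in use or inaccessible

    print(f"Roughly {total / 1024**2:.0f} MB of old temp files in {temp}")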



Step 9: Defragment your hard drive. Imagine a cassette tape with your favourite songs on it. Now imagine you delete a couple of songs and want to add a new one – there isn't enough room for it in either of the deleted spaces, but it can be split across them. Eventually, after deleting and adding songs, each song ends up scattered all over the tape. This is called fragmentation. The hard disk in your computer works in the same way, but we can use a program to move the files around to make them more efficient – this is defragmentation!
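
Here is a toy Python model of that cassette-tape analogy – a twelve-block "disk" where deleting files leaves gaps and a new file gets split across them:

    # Four files (A-D) of three blocks each on a tiny "disk".
    disk = list("AAABBBCCCDDD")

    # Delete files B and D, leaving gaps.
    for i, block in enumerate(disk):
        if block in "BD":
            disk[i] = "."

    # Write a new five-block file E into whatever gaps exist.
    new_file = list("EEEEE")
    for i in range(len(disk)):
        if disk[i] == "." and new_file:
            disk[i] = new_file.pop(0)

    print("".join(disk))  # -> AAAEEECCCEE.  (file E is fragmented)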



If your computer is still slow after all the above, then you might need to increase the memory. Unfortunately there are many different types of memory.



If you want us to do all this, it will take between one and two hours, but we might just make your machine work for another year or two.

How scientific thinking is all about making connections

When it comes to the field of science, making connections between those dots of knowledge seems to be just as important. In The Art of Scientific Investigation, Cambridge University professor W. I. B. Beveridge wrote that successful scientists “have often been people with wide interests,” which led to their originality:

Originality often consists in linking up ideas whose connection was not previously suspected.

He also suggested that scientists should expand their reading outside of their own field, in order to add to their knowledge (so they would have more dots when it came time to connect them, later):

Most scientists consider that it is a more serious handicap to investigate a problem in ignorance of what is already known about it.

Lastly, science writer Dorian Sagan agrees that science is about connections:

Nature no more obeys the territorial divisions of scientific academic disciplines than do continents appear from space to be colored to reflect the national divisions of their human inhabitants. For me, the great scientific satoris, epiphanies, eurekas, and aha! moments are characterized by their ability to connect.

Start making connections and getting creative

I’ll leave you with some suggestions for improving your own ability to make connections.

1. Add to your knowledge – the power of brand new experiences

After all, the more knowledge you have, the more connections you can make. Start by reading more, reading more widely, and exploring new opportunities for gathering knowledge (for instance, try some new experiences—travel, go to meetups or take up a new hobby).
As researcher Dr. Duezel explained when it comes to experiencing new things:

“Only completely new things cause strong activity in the midbrain area.”

So trying something new and forcing a gentle brain overload can make a dramatic improvement for your brain activity.

2. Keep track of everything – especially in the shower

As Austin Kleon suggests, take a notebook (or your phone) with you everywhere and take notes. Don’t expect your brain to remember everything—give it a hand by noting down important concepts or ideas you come across. As you do this, you may remember previous notes that relate (hey, you’re making connections already!)—make a note of those as well.


3. Review your notes daily – the Benjamin Franklin method

Going over your notes often can help you to more easily recall them when you need to. Read through what you’ve made notes of before, and you might find that in the time that’s passed, you’ve added more knowledge to your repertoire that you can now connect to your old notes!
In fact, this used to be one of Benjamin Franklin’s best kept secrets. Every morning and every evening he would review his day answering 1 simple question:

“What good have I done today?”

Here is his original daily routine.

Intelligence and connections: why your brain needs to communicate well with itself


Research from the California Institute of Technology showed that intelligence is something found all across the brain, rather than in one specific region:

The researchers found that, rather than residing in a single structure, general intelligence is determined by a network of regions across both sides of the brain.


One of the researchers explained that the study showed the brain working as a distributed system:

“Several brain regions, and the connections between them, were what was most important to general intelligence,” explains Gläscher.


The study also supported an existing theory about intelligence that says general intelligence is based on the brain’s ability to pull together and integrate various kinds of processing, such as working memory.


At Washington University, a research study found that connectivity with a particular area of the prefrontal cortex has a correlation with a person’s general intelligence.


This study showed that intelligence relied partly on high functioning brain areas, and partly on their ability to communicate with other areas in the brain.


Aside from physical connectivity in the brain, being able to make connections between ideas and knowledge we hold in our memories can help us to think more creatively and produce higher quality work.


Connections fuel creativity: nothing is original

Steve Jobs is an obvious person to reference whenever you're talking about creativity or innovation, so I wasn't surprised to find that he has spoken about making connections before. This great quote is from a Wired interview in 1996:

Creativity is just connecting things. When you ask creative people how they did something, they feel a little guilty because they didn’t really do it, they just saw something.

Jobs went on to explain that experience (as we saw in the image at the top of this post) is the secret to being able to make connections so readily:

That’s because they were able to connect experiences they’ve had and synthesize new things. And the reason they were able to do that was that they’ve had more experiences or they have thought more about their experiences than other people.

Maria Popova is arguably one of the best examples (and proponents) of what she calls “combinatorial creativity.” That is, connecting things to create new ideas:
in order for us to truly create and contribute to the world, we have to be able to connect countless dots, to cross-pollinate ideas from a wealth of disciplines, to combine and recombine these pieces and build new castles.
She's given a talk on this at a Creative Mornings event before, and made some great points. Being able to read about a wide range of topics is often one of the most important elements. I really liked how she pointed out the way our egos affect our willingness to build on what others have done before:

… something we all understand on a deep intuitive level, but our creative egos sort of don’t really want to accept: And that is the idea that creativity is combinatorial, that nothing is entirely original, that everything builds on what came before…

My favorite part of this talk is Popova’s LEGO analogy, where she likens the dots of knowledge we have to LEGO building blocks:

The more of these building blocks we have, and the more diverse their shapes and colors, the more interesting our castles will become.

Author Austin Kleon is someone who immediately comes to mind whenever the topic of connections and remixing art comes up. Kleon is the author of Steal Like An Artist, a book about using the work of others to inspire and inform your own.
It starts off like this:
Every artist gets asked the question, “Where do you get your ideas?”
The honest artist answers, “I steal them.”

Kleon is inspiring because he’s so upfront about how the work of other people has become part of his own work. He’s also keen on the phrase I quoted from Maria Popova above, that “nothing is original”:

Every new idea is just a mashup or a remix of one or more previous ideas.

If you're looking for advice on creating more connections between the knowledge you have (and collecting even more knowledge), Kleon's book is a great place to start. He offers suggestions like:

  • carry a notebook everywhere
  • read a lot
  • keep a scratch file

Which is better Knowledge or Experience?

It's a simple answer really: knowledge is of no use if you don't have the experience to know what to apply, how and when.

It's a concept that I have understood but have had great difficulty explaining. When I talk publicly and use our full tagline, "Because 30 years experience really counts", I get loads of nods of understanding – but how do you explain the difference so it really hits home? Well, I have found an easy way. I would like to take the credit, because it's so blindingly obvious, but I can't.

There’s a key difference between knowledge and experience and it’s best described like this:

The original is from cartoonist Hugh MacLeod who came up with such a brilliant way to express a concept that’s often not that easy to grasp.



The image makes a clear point—that knowledge alone is not useful unless we can make connections between what we know. Whether you use the terms “knowledge” and “experience” to explain the difference or not, the concept itself is sound.



Oh dear, the cloud chickens are coming home to roost – but what about all the small guys who are using it?


Or why I think there is a similarity between cloud suppliers, PPI, endowments and extended warranties: they are all Ponzi schemes designed to make someone richer or better off – but not you.

Enterprises want governance in the cloud in order to maintain control and assure stakeholders and regulators they can manage mission-critical systems that have grown increasingly complex and integrated.



This means having secure communications with trusted suppliers, and being able to sleep at night because you know your intellectual property is in a safe place.


In banking this means that monetary transactions, reconciliations and reports for regulators are dependably processed by the cloud provider. In healthcare, governance means HIPAA compliance and the ability to secure and protect patient information.
But then there is the other end of the cloud spectrum, which presents cloud as a commodity service that can be subscribed to and unsubscribed from at will, and that carries bargain-level pricing that makes it difficult for business users to say "no" to the service.


The tradeoff in many cases is that you get the service inexpensively, but you shouldn’t necessarily depend upon it to meet your needs.


Examples of cloud services that are decidedly more casual in governance include those that crash or "go down" without much explanation. (Read: Microsoft's Windows Azure Compute cloud suffers global crash, Google Drive Crashes for "Significant Subset" of Users, and Verizon Launches Broken, 'Me Too' Cloud Storage Platform.) If these failures occurred on premises in enterprises, CIOs would face numerous questions from board members and stakeholders, and might even be in danger of losing their jobs.

Lax governance practices and a seemingly casual attitude toward public cloud providers have caused many enterprise CEOs to opt for their own private cloud solutions instead.



Nevertheless, there is also a strong enterprise argument for less expensive cloud solutions that are inexpensive because they don’t have to invest so much in governance. After all, don’t worldwide enterprises collectively have millions of business users who are already accustomed to routine crashes of their word processing software? To these users, the comfort level with the software and its relative inexpensiveness is enough to convince them to simply get a cup of coffee and wait while the system reboots.


This dichotomy in how business feels (or doesn't feel) about governance presents an interesting question to cloud providers, which must now decide, for the present and the foreseeable future, what "kind" of provider they want to be. Do they want to be the platinum-grade, full-strength enterprise solutions, with a price tag for services and diligence in governance that reflects the effort? Or do they want to be more of the "discount store" variety of service that everybody can afford and get value from, at the sacrifice of enterprise-strength governance?


There’s room for both. And the sooner cloud providers decide which type of cloud provider they want to be, the sooner it gets easier for enterprises and SMBs to differentiate them from others and to understand the specifics of the value propositions they offer.


The reason for comparing Ponzi schemes to cloud providers is simple. We are being offered allegedly 100% secure services which we can resell for a lot of money, but the provider could be in a shed with an ADSL line. The client believes they are getting a cast-iron guarantee of safety, we would make a fortune out of commission, and the original supplier can supply a dream for peanuts. See the similarity?

Virtualisation – Doing More with more and paying less
– a beginners guide

The trend now is to do away with several servers and combine them into one. It seems easy, but many so-called IT people don't understand it, so what chance do you have? Well, here is a simple guide. Ask your proposed installer – and if they can't make it this simple, don't use them, use us instead!


A file server isn't just a box that stores data for sharing – it's much more than that, and it can do even more too.



Let me explain all the tasks and jobs that a server has to do:



Domain controller. It looks after all the computers and users: who does what, who can do what, and where and with what. It contains all the permissions and security rules, and without it your network domain won't work. It holds all this in a thing called Active Directory – like a phone book.



DHCP controller. When a computer, phone, tablet or anything else joins the network, it needs some information so it can connect with everything else. The DHCP controller hands out this information (an IP address and related settings) and also limits how long the connection lease lasts.



File server. This shares the data files according to the rules of the domain controller and Active Directory.

That's the simple stuff; now comes the specialised heavyweight stuff:



SQL Server This is a special role for holding and handling data. Big fat files need special handling to keep them working quickly.



Exchange Server. This is the e-mail, diary and contacts server. It's what makes Outlook work in a network.



Remote Desktop Server aka Terminal Services. This takes some imagining. Imagine several desktop computers, now cram them into one box so that several people can connect remotely as if they were using a computer in the office.

In the past you needed three or four servers: one for the simple stuff and one each for the heavier roles. You could not put them all on one machine, as they would fight with each other and the system would become unreliable.

Computers have come a long way and are now much more powerful. A server still spends a lot of time doing nothing, and a way has been developed to use this spare time without needing all the extra drives, power supplies, video cards and so on. This is called virtualisation.

Imagine we take a big, powerful computer which has 24GB of memory and two network cards. It's going to be cheaper than four separate servers. Now divide it up so that there is the basic server, which has 8GB of memory reserved for it. Now take the rest of the memory plus a lump of hard disk, split it into two, and treat these parts as separate servers – separate memory, but sharing the same power supply, processor and so on.

If a single server costs, for example, £3,000, then three or four of them would cost £9,000–£12,000. By using virtualisation we can use a £6,000 computer to do all the tasks.
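
A quick back-of-an-envelope version of those sums (the figures are just the illustrations used above):

    # Cost comparison from the example above.
    standalone = 4 * 3_000          # four separate £3,000 servers
    virtualised = 6_000             # one bigger box running everything as VMs
    print(f"Four standalone servers: £{standalone:,}")
    print(f"One virtualised host:    £{virtualised:,} (saving £{standalone - virtualised:,})")

    # The memory carve-up on the 24GB host.
    total_ram, host_reserve, vm_count = 24, 8, 2
    per_vm = (total_ram - host_reserve) // vm_count
    print(f"Host keeps {host_reserve} GB; each of the {vm_count} VMs gets {per_vm} GB")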

Virtualization saves money, energy, and space. After you’ve decided to go virtual, take steps to make implementation easier: Get to know some important terms about virtualization, types of virtualization, and leading companies and products in virtualization.


Then we have backup and Anti Virus control.

So the reasons for using Virtualization?

  • It saves money: Virtualization reduces the number of servers you have to run, which means savings on hardware costs and also on the total amount of energy needed to run hardware and provide cooling
  • It’s good for the environment: Virtualization is a green technology through and through. Energy savings brought on by widespread adoption of virtualization technologies would negate the need to build so many power plants and would thus conserve our earth’s energy resources. 
  • It reduces system administration work: With virtualization in place, system administrators would not have to support so many machines and could then move from fire fighting to more strategic administration tasks. 
  • It gets better use from hardware: Virtualization enables higher utilisation rates of hardware because each server supports enough virtual machines to increase its utilisation from the typical 15% to as much as 80%. 
  • Backing up is easier and each system doesn’t take from the other. They all have their separate resources rather than sharing them.


  • Licensing
    You need a base licence of Server with Hyper-V (The virtualiser)

    You need a Server Licence for each virtual machine. In our example two

    You need a licence for Exchange and SQL servers

    You need user licences (CALs) for each concurrent user of each system.

    Why is Sage Line 50 so slow?

    I wrote this some time ago but after a conversation recently I thought I would resurrect it as it still seems to be very topical.


    Sage Line 50 is a nice program for small businesses but it has one major flaw – it has effectively killed off all the competition.  That’s not a bad thing because it means that it will go on for ever as a standard but it also means there is no incentive to make it any better.

    This means that Sage's fatal flaw will probably never be fixed. It was designed to work on a single computer in a small office: the design was to store data on the local hard drive, in Sage's own .DTA file format. I think these .DTA files are pretty much unchanged since Graham Wylie's original program was written for CP/M on an Amstrad PCW. When used on a network, it stores its data in disk files on a server, shared across the network.


    So why is Line 50 so slow? The problem with Sage's strategy of storing data in shared files is that when you have multiple users, the files are opened, locked, read and written by multiple users across a network at the same time. It stands to reason that on a non-trivial set of books this will involve a good number of files, some of them very large. Networks are slow compared to local disks, and certainly not as reliable, so you're bound to end up with locked-file conflicts, and would be lucky if data weren't corrupted from time to time. As the files get bigger and the number of users grows, the problem gets worse exponentially.
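
    If you want to see the kind of byte-range locking involved for yourself, here is a rough, Windows-only Python sketch (the file name is made up, and this is an illustration of shared-file locking in general, not of Sage's actual code). Run it in two command windows at once and the second copy will be refused the lock while the first still holds it:

        import msvcrt  # Windows-only locking, standard library
        import os
        import time

        FILENAME = "shared.dta"  # hypothetical data file, purely for illustration

        fd = os.open(FILENAME, os.O_RDWR | os.O_CREAT)
        try:
            # Try to lock the first byte without waiting.
            msvcrt.locking(fd, msvcrt.LK_NBLCK, 1)
            print("Got the lock - pretending to update the books for 30 seconds...")
            time.sleep(30)
            os.lseek(fd, 0, os.SEEK_SET)
            msvcrt.locking(fd, msvcrt.LK_UNLCK, 1)
        except OSError:
            print("Another user has the file locked - this is the wait Sage users see.")
        finally:
            os.close(fd)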


    Sage won't admit it. The standard Sage solution seems to be to tell people their hardware is inadequate. In a gross abuse of their consultancy position, some independent Sage vendors have been known to sell hapless users new high-powered servers, which does make the problem appear to go away – until, of course, the file gets a bit bigger.

    Anyone who knows anything about networking will realise straight away that this is a hopeless situation – but not those selling Sage, at least not in public.
    In fact it’s in Sage’s interests to keep Line 50 running slower than a slug in treacle. Line 50 is the cheap end of the range – if it ran at a decent speed over a network, multi-user, people wouldn’t buy the expensive Line 200. The snag is that Line 50 is sold to small companies that do need more than one or two concurrent users and do have a significant number of transactions a day.


    There is continual talk that the newer versions will use a proper database; indeed, in 2006 they announced a deal to work with MySQL. But the world has been waiting for the upgrade ever since. It's always coming in "next year's" release, but "next year" never comes. The latest (as of December 2009) is that they're 'testing' a database version with some customers and it might come out in version seventeen. (2014 update: it's still not there yet.) It's amazing that Sage's other products, which they bought in, such as ACT!, all use SQL.


    One Sage Solution Provider, realising that this system was always going to time-out in such circumstances, persuaded the MD of the company using it to generate all reports by sitting at the server console. To keep up the pretence this was a multi-user system, he even persuaded them to install it on a Windows Terminal Server machine so more than one person could use it by means of a remote session.


    If that weren’t bad enough, apparently it didn’t even work when sitting at the console, and they’ve advised the customers to get a faster router. I’m not kidding – this really did happen.
    The fact is that Sage Line 50 does not run well over a network due to a fundamental design flaw. It’s fine if it’s basically single-user on one machine.


    What can you do to fix it?
    If you accept that Sage Line 50 is fundamentally flawed when working over a network you’re not left with many options other than waiting for Sage to fix it. All you can do is throw hardware at it. But what hardware actually works?

    First the bad news – the difference in speed between a standard server and a turbo-nutter-spaceship model isn't actually that great. If you're lucky, on a straight run you might get a four-times improvement from a user's perspective. The reason for spending lots of money on a server has little to do with the speed a user sees; it's much more to do with the number of concurrent users.

    So, if you happen to have a really duff server and you throw lots of money at a new one you might see something that took a totally unacceptable 90 minutes now taking a totally unacceptable 20 minutes. If you spend a lot of money, and you’re lucky.

    The fact is that on analysing the server side of this equation I've yet to see the server itself struggling with CPU time, running out of memory, or anything else to suggest that it's the problem. With the most problematic client they started with a dual-core processor and 4GB of RAM – a reasonable specification for a few years back. At no time did I see issues to do with the memory size, and the processor utilisation was only a few percent on one of the cores.

    I’d go as far as to say that the only reason for upgrading the server is to allow multiple users to access it on terminal server sessions, bypassing the network access to the Sage files completely. However, whilst this gives the fastest possible access to the data on the disk, it doesn’t overcome the architectural problems involved with sharing a disk file, so multiple users are going to have problems regardless. They’ll still clash, but when they’re not clashing it will be faster.

    But, assuming you want to run Line 50 multi-user the way it was intended, installing the software on the client PCs, you're going to have to look away from the server itself to find a solution.
    The next thing Sage will tell you is to upgrade to 1Gb Ethernet – it’s ten times faster than 100Mb, so you’ll get a 1000% performance boost. Yeah, right!

    It’s true that the network file access is the bottleneck, but it’s not the raw speed that matters.
    I’ll let you into a secret: not all network cards are the same.

    They might communicate at a line speed of 100Mb, but this does not mean that the computer can process data at that speed, and it does not mean it will pass through the switch at that speed. This is even more true at 1Gb.

    This week I've been looking at some 10Gb network cards that really can do the job – communicate at full speed without dropping packets and pre-sort the data so a multi-CPU box can make sense of it. They cost £500 each, and they're probably worth it from a performance point of view, but you will need fast cable and fast switches too – more on switches later.

    Have you any idea what kind of network card came built in to the motherboard of your cheap-and-cheerful Dell? I thought not! But I bet it wasn’t the high-end type though.

    The next thing you've got to worry about is the cable. There's no point looking at the wires themselves or what the LAN card says it's doing – you'll never know. Testing that a cable has the right wires on the right pins is not going to tell you what it will do when you put data down it at high speed. Unless the cable is perfect it's going to pick up interference to some extent, most likely from the wire running right next to it, but you'll never know how much this is affecting performance. The wonder of modern networking means that errors on the line are corrected automatically without bothering the user. If 50% of your data gets corrupted and needs re-transmission, then by the time you've waited for the error to be detected, the replacement requested and the intervening data put on hold, your 100Mb line could easily be clogged with 90% junk – but the line speed will still say 100Mb with minimal utilisation.
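
    To put a rough number on that, here is a small Python sketch of how retransmissions eat a link's useful capacity (it ignores the extra waiting and re-request delays described above, which make things even worse in practice):

        # With an error probability p per frame, each frame needs on average
        # 1 / (1 - p) transmissions, so the useful share of the line is (1 - p).
        def goodput(line_speed_mbps, error_rate):
            return line_speed_mbps * (1 - error_rate)

        for p in (0.01, 0.1, 0.5):
            print(f"error rate {p:>4.0%}: ~{goodput(100, p):5.1f} Mb/s useful on a 100 Mb/s line")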

    Testing network cables properly requires some really expensive equipment with wonderful names like time domain reflectometer, and the only way around it is to have the cabling installed by someone who really knows what they’re doing with high-frequency cable to reduce the likelihood of trouble. If you can, hire some proper test gear anyway. What you don’t want to do is let an electrician wire it up for you in a simplistic way. They all think they can, but believe me, they can’t.

    Next down the line is the network switch and this could be the biggest problem you’ve got. Switches sold to small business are designed to be ignored, and people ignore them. “Plug and Play”.

    You'd be forgiven for thinking that there wasn't much to a switch, but in reality it's got a critical job, which it may or may not do very well in all circumstances. When it receives a packet (a sequence of data, a message from one PC to another) on one of its ports, it has to decide which port to send it out of to reach its intended destination. If it receives multiple packets on multiple ports it has to handle them all at once. Or one at a time. Or give up and ask most of the senders to try again later.

    What your switch is doing is probably a mystery, as most small businesses use unmanaged “intelligent” switches. A managed switch, on the other hand, lets you connect to it using a web browser and actually see what’s going on. You can also configure it to give more priority to certain ports, protect the network from “packet storms” caused by accident or malicious software and generally debug poorly performing networks. This isn’t intended to be a tutorial on managed switches; just take it from me that in the right hands they can be used to help the situation a lot.

    Unfortunately, managed switches cost a lot more than the standard variety. But they’re intended for the big boys to play with, and consequently they tend to switch more simultaneous packets and stand up to heavier loads.

    Several weeks back we upgraded the site with the most problems from good quality standard switches to some nice expensive managed ones, and guess what? It’s made a big difference. My idea was partly to use the switch to snoop on the traffic and figure out what was going on, but as expected  it also appears to have improved performance, and most importantly, reliability considerably too.

    If you’re going to try this, connect the server directly to the switch at 1Gb. It doesn’t appear to make a great deal of difference whether the client PCs are 100Mb or 1Gb, possibly due to the cheapo network interfaces they have, but if you have multiple clients connected to the switch at 100Mb they can all simultaneously access the server down the 1Gb pipe at full speed (to them).


    This is a long way from a solution, and it's hardly been conclusively tested, but the extra reliability and resilience of the network has at least allowed a Sage system to run without crashing and corrupting data all the time.


    If you’re using reasonably okay workstations and a file server, my advice (at present) is to look at the switch first, before spending money on anything else.


    Anything else?
    The other solutions are: a) invest in a good Unix server – it's complicated, but basically the server handles files faster and better than Microsoft operating systems, which is why all the big companies use it; or b) invest in a terminal server running Remote Desktop with the data held locally – this way all the processing is done on one machine, the server, and then the results are sent out to the workstations.

    ……………… or wait for Sage to fulfil their 2006 promise – it's only been eight years, and did I tell you it's not in their interest?

    Congratulations, you made it to 2014 – It’s a good year to be tax efficient and start buying some things – NOT!
     

     

    If your car is newer than your computer equipment then you are probably making the same mistakes as RBS, Lloyds and 36% of the businesses who are using XP.
     
    Let's face it, business has not been good or moving in an upward direction since 2008. Yes, things are more optimistic now, but you may be one of the many businesses facing problems: for some it is cash flow, for others it's funding the extra business – let's face it, there is now a danger of overtrading, where you don't want to run out of liquidity. You also probably know deep down that you need new IT equipment, and there are two temptations: buy the cheapest possible, or soldier on until it breaks. OUCH! That's expensive.
     
    Neither is a real option, so you have decided to choose the right equipment from the right supplier – but you didn't realise how expensive computer equipment is. Surprise, surprise, it's gone up since 2008.
     
    Sourcing funds to buy “stuff”
     
    You have a few options: use your cash or borrow it. Certainly in an upturn you can make enough money to fund the borrowing, but with the bank charging 6.5% for a low-cost loan and credit cards still charging 20%+, it doesn't make much sense, especially if you can only claim a small amount back against tax.
     
    There is a way that works: you can claim everything against tax and spread your cost over a few years. Because interest rates are low it's a good time, and as the rate is fixed when you sign the deal, you won't get a shock when interest rates go up.
     
    Let's look at £10,000 of equipment, which is a mid-range server and some PCs installed with software.
     
    If you pay cash, you have taken that out of your cash flow. You may have to borrow the money, and you can only claim 20% back against tax. The borrowing rate may also change.
     
    To lease the equipment over four years will cost you £70 a week. Over four years it works out at around 4.4% interest per annum, and you can offset all the payments against tax. There is a residual buy-back agreement that allows you to buy the equipment at the end of the term, usually for one quarter's payment.
     
    If you want to protect your cash flow, spread your costs, save money and have maximum tax efficiency, then it's the best way to have some "stuff".
     
    Why are we going on about it now? Simple, really: before 2008 it was cheap to borrow money and no one needed leasing – everyone was rolling in cash and life was good – so we stopped offering leasing in 2000. It makes more sense now than ever: with low interest rates and a need to keep cash flow going, especially for companies damaged in the recession, it makes perfect sense for us to start offering leasing again.
     
    We are the intermediary, we don’t get commission or any benefits but we like to offer our clients something that the others don’t. Experience tells us what is the best choice for our clients.
     
     
     
     
     
     
     
     
     

    10 Tips to Make a Secure Password

    Having received a funny but rude post about passwords (email me at glyn@cmx.co.uk if you don't mind that and want to laugh), it reminded me that I haven't published anything about passwords recently.
     
    Regardless of whether you’re using a client PC, a server, a tablet, a smartphone, or any other digital device to access information, choosing strong passwords for the programs and services you use is essential. While a weak password may not always be the primary cause of IT services getting hacked, it can contribute greatly to the scope and severity of a hacking attempt.

    To help you make the most secure passwords possible – both for administrators and for end-users – I've assembled ten bits of advice that should improve the strength and effectiveness of your passwords. None of these suggestions provides enough security on its own, so I'd strongly suggest adopting as many of these tips and techniques as possible.
     
    There are several third-party applications that can help you automate and enforce password policies or allow users to reset their own passwords, like ManageEngine's ADSelfService Plus and Specops Software's Password Reset tool.
     
    1. Adopt a password change policy
    One of the best defenses against stolen passwords is to frequently change the passwords being used. There's always a balance between security on one hand and usability on the other, so forcing password changes too often can lead to an excessive burden on users. Who wants to be forced to change their passwords every few weeks? Enforcing password changes every six months provides a good balance between frequency and security.
     

    2. Use caps and special characters
    Passwords that consist solely of traditional text characters are easier for attackers to guess and crack. For example, a password like "waffle" can be cracked relatively quickly using brute-force methods, but something like "waff!E" would be much more difficult and time-consuming to crack.
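
    To see why the extra character types matter, here is a small Python sketch that counts how many combinations a brute-force attacker has to try for a six-character password as the character set grows:

        import string

        # Brute-force search space for a six-character password.
        length = 6
        alphabets = {
            "lower-case letters only": string.ascii_lowercase,
            "plus upper case": string.ascii_letters,
            "plus digits": string.ascii_letters + string.digits,
            "plus special characters": string.ascii_letters + string.digits + string.punctuation,
        }
        for label, chars in alphabets.items():
            print(f"{label:24} {len(chars):3} characters -> {len(chars) ** length:,} combinations")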
     
    3. Avoid common words
    It’s a tired old joke in IT circles: Many users adopt some of the most obvious and most insecure words for their passwords, ranging from the ubiquitous “password” to using their first names, name of their company, or the brand name of the monitor they were staring at when they came up with the password. A password policy that doesn’t allow users to use common, easily guessed words like “password” can help improve security substantially.
     
    4. Develop a nonsense phrase
    Some of the best passwords are based on nonsensical phrases that only make sense to the user that created them. For example, if the user has a dog named Ranger that likes to catch Frisbees, creating a password like “dograngerfrisbee” will be easy for the user to remember, but hard for attackers to easily crack using brute force methods or by guessing.
     
    5. Enforce a minimum password length
    In addition to comically simple passwords like "password" and "beer," another common problem with many passwords is that they're simply too short. "ABC" and "123" may be easy to remember, but they're equally easy for attackers to compromise. Enforcing a minimum password length of at least eight characters is yet another way to increase the complexity of assigned passwords.
     
    6. Don’t share passwords
    In this era of shared cloud services, sharing a common password can be a security Achilles heel in any organization. The marketing department may be using a SurveyMonkey account to generate web surveys, but multiple users share that one account. We've all seen sticky notes with passwords affixed to monitors, so work with your HR department to make sure that users know the importance of limiting passwords to only the people that need access. If a passer-by can easily spot and use a password in a public place, you have to assume that someone with more malevolent intent could see it as well.

        

    7. Create unique passwords for each service
    One of the ways that password breaches can become exponentially more damaging is if users are employing the same password on multiple services. If an attacker gains access to one service – say the corporate Facebook account – then he or she could potentially reuse that username and password combination on dozens of other cloud services.
     
    8. Use a password management service
    Sometimes the best approach to password complexity issues is to use a password management service like KeePass, Kaspersky Password Manager, or LastPass.
    All of these password managers follow the same basic idea: Rather than having to remember all of those individual passwords, the password manager does that for you, automatically filling passwords in when needed. The only password you need to remember is the one to the password manager itself.
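
    If you are curious how a manager (or a script of your own) can generate the strong, unique passwords described above, here is a minimal Python sketch using the standard library's secrets module:

        import secrets
        import string

        # Generate a strong, random password to store in your password manager.
        def make_password(length=16):
            chars = string.ascii_letters + string.digits + string.punctuation
            return "".join(secrets.choice(chars) for _ in range(length))

        print(make_password())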
     
    9. Adopt two-factor authentication
    Outside of following all the steps described above, sometimes the most secure approach is to adopt two-factor authentication in conjunction with strong passwords. This requires users to enter a code generated by a separate application or device in order to log in. These can take the form of something like RSA's SecurID two-factor authentication token – a small hardware device that randomly generates an authentication code – or a software application, such as the two-factor authentication app that you can configure for Google Mail.
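
    For the technically minded, here is a minimal Python sketch of the time-based one-time password scheme (TOTP, RFC 6238) that authenticator apps of this kind typically use. The secret below is a made-up example, not a real key:

        import base64
        import hashlib
        import hmac
        import struct
        import time

        SECRET = "JBSWY3DPEHPK3PXP"  # example base32 secret, not a real one

        def totp(secret_b32, digits=6, interval=30):
            key = base64.b32decode(secret_b32, casefold=True)
            counter = struct.pack(">Q", int(time.time()) // interval)
            digest = hmac.new(key, counter, hashlib.sha1).digest()
            offset = digest[-1] & 0x0F
            code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
            return str(code % 10 ** digits).zfill(digits)

        print("Current one-time code:", totp(SECRET))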
     
    10. Use a biometric fingerprint scanner
    Biometric hardware has advanced significantly in the past few decades, so authentication methods which once seemed like science fiction – the ability to scan fingerprints, analyse typing patterns, or recognize a user's facial features – are now a reality. One of the most widely used biometric devices is a fingerprint scanner, which does exactly what the name implies: the user swipes her fingertip across the scanner to have her fingerprint read and to gain access. Major PC hardware vendors such as Dell, HP, Toshiba, and Lenovo offer laptops with integrated fingerprint scanners, and several vendors sell stand-alone scanners that can be connected to any PC.

     

     
     

     

    Import Facebook events into your Outlook calendar

    Find out how to view special events such as birthdays stored on the Internet in an Outlook calendar on your local system.
     
    There’s a ton of information on the Internet, and thanks to Outlook, you can access some of it locally. Using Outlook’s Internet Calendars, you can synchronize special events stored on the Internet in your local copy of Outlook. Fortunately, the process of linking to these sources is easy.

    About Internet Calendars

    Before we launch into an example, here’s an overview of these special calendars. Internet Calendars are calendars that we view on the Internet. They’re based on a global standard that allows us to exchange information without consideration for the hosting application. These files use a format known as iCalendar and use an .ics extension.
    Outlook supports two types of Internet Calendars: snapshots and subscriptions. You’ll send snapshot calendars using email. This calendar is a one-time review that isn’t linked to a source calendar, so it won’t update when someone changes the source.
    To send a snapshot, open a new email message and click the Insert tab. You’ll find the Calendar option in the Include group. You can customize the calendar by deciding the amount of information that you send, as you can see in Figure A. For instance, you can specify a date range, busy status, and other appointment details.
    Figure A
     


     
    The recipient can open the file as an Outlook calendar. In addition, the recipient can drag events from the snapshot calendar and use Outlook’s overlay feature (which we’ll learn about later) to visually merge the received calendar with their own. Some people use this feature to quickly back up their calendar(s). You can do so by selecting the calendar, and then clicking the File tab and choosing Save Calendar.
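
    For anyone curious what these .ics files actually contain, here is a minimal Python sketch that writes a tiny calendar file by hand (the event details are made up for illustration):

        from datetime import datetime, timedelta

        # A minimal, hand-rolled .ics file with a single made-up event.
        start = datetime(2014, 2, 14, 19, 0)
        end = start + timedelta(hours=2)
        ics = "\r\n".join([
            "BEGIN:VCALENDAR",
            "VERSION:2.0",
            "PRODID:-//Example//Demo//EN",
            "BEGIN:VEVENT",
            "UID:demo-0001@example.com",
            "DTSTAMP:20140128T120000Z",
            f"DTSTART:{start:%Y%m%dT%H%M%S}",
            f"DTEND:{end:%Y%m%dT%H%M%S}",
            "SUMMARY:Birthday dinner",
            "END:VEVENT",
            "END:VCALENDAR",
            "",
        ])

        with open("demo.ics", "w", newline="") as f:
            f.write(ics)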

    Importing Internet events

    Snapshot calendars are static, but Outlook’s subscription calendars synchronize with the source calendar (stored on a web server). You download the calendar file to your local Outlook version. When the hosting site updates the calendar, those updates are downloaded to you.
    Facebook is a great example because so many people have accounts. You can download two calendar files: your friends’ birthdays and events. Let’s download the birthday calendar.
    1. Log in to your Facebook account.
    2. Click the Events link on the left (in Favorites) (Figure B).
    Figure B
     


     
    3. In the top-right corner, find the gear icon, and click its dropdown arrow.
    4. Choose Export (Figure C).
    Figure C
     


      
    5. Figure D shows the resulting dialog. You can click either of the circled links to begin the process. If you're following this example, click the birthday link.
    Figure D
     


       
    6. If Outlook is your default client, Windows will select it for you (Figure E). In this case, click OK. If Outlook is not your default client, select Outlook from the list or click Choose (if necessary).
    Figure E
     


      
    7. Click Yes to confirm that you’re subscribing to Facebook’s birthday Internet Calendar. Outlook will import the birthdays into a new calendar. This new calendar will automatically synchronize with Facebook to add, delete, or modify birthdays as they’re updating in Facebook. If you unfollow a friend in Facebook, Outlook will delete that friend’s birthday from your local calendar.
    If the above process doesn’t work for you, follow these steps to add the calendar manually.
    1. Repeat steps 1 through 4.
    2. Right-click the appropriate link (see Figure D) and choose Copy Shortcut or Copy Link Location.
    3. Open Outlook.
    4. Open the Calendar window.
    5. On the Home tab, click Open Calendar in the Manage Calendars group. If you’re using Outlook 2003 or Outlook 2007, choose Account Settings from the Tools menu.
    6. In the resulting dialog, paste the copied link (Figure F).
    7. Click OK.
    Figure F 
     


      
     
    Outlook will display birthdays and events in separate calendars, separate from your Outlook calendar. I don't recommend combining them, but you can use Overlay to visually merge them.
    1. Click the View tab.
    2. Click the birthday calendar and then click Overlay in the Arrangement group; this will display the default calendar (yours) and the birthday calendar as one. In Outlook 2007, choose View In Overlay mode from the View menu. In Outlook 2003, use the Side-By-Side Calendars feature by checking individual calendars, accordingly.
    Overlay is a visual enhancement; this feature won't actually combine calendar files. You can add several calendars to the overlay, one at a time.
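
    As an aside, if you ever want to peek at what a subscribed calendar feed contains outside Outlook, here is a rough Python sketch. The URL is a placeholder for the link you copied in the manual steps above (change webcal:// to https:// first), and the parsing is deliberately crude – it ignores iCalendar line folding:

        import urllib.request

        URL = "https://example.com/u/ical/birthdays.ics"  # placeholder feed address

        with urllib.request.urlopen(URL) as resp:
            lines = resp.read().decode("utf-8", errors="replace").splitlines()

        # Print each event's start date and summary.
        summary, start = None, None
        for line in lines:
            if line.startswith("BEGIN:VEVENT"):
                summary, start = None, None
            elif line.startswith("SUMMARY"):
                summary = line.split(":", 1)[1]
            elif line.startswith("DTSTART"):
                start = line.split(":", 1)[1]
            elif line.startswith("END:VEVENT"):
                print(start, summary)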

    Stay in touch

    Outlook’s Internet Calendars are a flexible tool. You can share static appointments via email or download interactive calendars from the Internet into your local version of Outlook. Either way, you’ll have the information you need at your fingertips.

    By Susan Harkins January 28, 2014

    About

    Susan Sales Harkins is an IT consultant, specializing in desktop solutions. Previously, she was editor in chief for The Cobb Group, the world's largest publisher of technical journals.