bit-tech.net

Technology Apathy Is Bad For Everyone

Posted on 12th Apr 2012 at 10:03 by Paul Goodhead with 37 comments

As you’d expect, I’m often sought out by friends and family when they’ve got a PC or laptop buying decision to make - chances are many of you are too, given that you’re here on bit-tech. I don’t mind doing it of course, but I’m always intrigued by the approach some people take when buying a new computer.

The aspect that surprises me the most is the way that many of the people who have sought my advice over the years have a ‘just enough to get me by’ approach to computing - they’re only looking for a PC or laptop that will perform the tasks they do now. This is their prerogative obviously - it’s their money after all - but I’ll always challenge them on it.

Why, I’ll ask, are you only looking for something that will satisfy you now - why not dream a little; why not expand your horizons and see what difference a little more computing power will make to your life? Invariably the answer boils down to the fact that they don’t see their computer usage changing at any time in the near future - they do some word processing and internet browsing and that’s about it, so why would they need to buy a computer that can deal with more than that?

Unfortunately, life is rarely that simple or consistent; things can change at a moment’s notice, and things you never even considered before can start to have an effect on how you use your PC.

As an example, I recently got into digital photography and have, as a result, found myself editing images and conducting photomerges of large RAW files. This hasn’t posed me any problems, as I’ve got a capable PC at home that can deal with this kind of workload (despite being designed primarily for gaming), so I’ve been able to get on with the fun business of actually taking pictures.

If, on the other hand, I’d had a piddly little 15in laptop at home (which was all I needed to surf the web and view the occasional document), then my new hobby of digital photography would quickly have become a chore; photos would stay unprocessed on the SD card, and I’d slowly stop using my camera.

It’s not just new equipment that can change usage patterns; a new job can have the same effect, as can having a child start school. Even things like illnesses can change how people put their PC to work - how many of our folding team got into it because they know or knew somebody who was affected by one of the diseases caused by misfolding proteins?

The tech industry is, of course, doing its best to push people on from this ‘just enough’ attitude - it’s in its interest, after all. It’s a tough job though; the world is so saturated with advertising these days that very little gets through.

As a result I believe it falls to people like us, the tech enthusiasts that often provide advice for family and friends, to take up some of the burden. I’m not talking about forcing your mum to buy a watercooled supercomputer she clearly doesn’t need; I’m just suggesting that we should be prepared to explain to those that we help the benefits of trading up a little - ‘Sure, that is a lovely little netbook, but it won’t be any good for showing the videos of little Timmy’s first steps on, will it?’

We are after all the ones that’ll benefit from a healthy and prosperous tech industry. Yes, we’re more concerned with the enthusiast end of the market, but many of the companies that operate here have their foundations in the mass market, and if that stagnates, it’ll be bad for us all.

37 Comments

enciem 12th April 2012, 11:41 Quote
I always go with "You'll need to buy a reasonable computer because of your propensity for clicking on adverts and downloading countless toolbars for internet explorer and swathes of malware"
Elton 12th April 2012, 11:54 Quote
I do agree. However, it's difficult to sell someone on the idea that paying for a bit more headroom is worthwhile.

Mind you, with most of my relatives I just give them my old stuff or recommend they get slightly older and massively cheaper hardware.
OzThe2 12th April 2012, 12:08 Quote
I've been 'burnt' before by giving out advice and now I don't bother; instead I point them to this handy guide so they can make their own minds up: http://www.gpanswers.com/blog/139-non-gp-related/706-how-to-buy-a-laptop-for-2012-for-the-regular-person.html
WarrenJ 12th April 2012, 13:05 Quote
After doing a fair amount of Photoshopping recently, my once-sufficient 4GB of RAM will have to be upgraded to 8GB - though when I built the PC, it was more than I needed.
Adnoctum 12th April 2012, 13:31 Quote
People might be more inclined to upgrade more often if manufacturers gave consumers what they want and not what is convenient or easy for the manufacturers.

Case in point: laptop screens. Who cares if laptops come with 8 cores and clocked to 4GHz if they come with the worst screens ever made? How many people are happy with shiny, 15.6in, 1366x768, low DPI screens with the most appalling colour reproduction ever seen?

I want to upgrade my mother to a new laptop because I'm getting tired of trying to support an 8-yo IBM with a Celeron M 1.6GHz, Mobility 7500 and 256MB of RAM. Every time it slows down, she complains and I have to re-image the system and apply various updates (which slow the damn thing down again!).
All because she HATES 16:9 screens and manufacturers don't make laptops with 4:3 screens any more. She likes 1024x768 on a 4:3 14in screen, or at least likes it better than 16:9.

I have been seriously thinking about picking up a refurbished Lenovo R60, last of the 4:3 laptops, which would come with a C2D of around 2GHz and should be reasonably fast for the foreseeable future.

As a point of interest, what do people use their laptops for?
The people in my social sphere use it for browsing the Internet and creating various documents, neither of which is enhanced by 15.6in 1366x768 screens.
What I find is that the same group of people don't use their laptops for sitting down to watch movies or full-screen Flash videos, so why is this made the main feature of laptops? Other than the laptop manufacturers' race to the bottom?

I'd pay a 30% premium for a basic laptop with a better screen, without having to spend $2000+ on a high-end system.
Bakes 12th April 2012, 15:38 Quote
Quote:
Originally Posted by Adnoctum
I want to upgrade my mother to a new laptop because I'm getting tired of trying to support an 8-yo IBM with a Celeron M 1.6GHz, Mobility 7500 and 256MB of RAM. Every time it slows down, she complains and I have to re-image the system and apply various updates (which slow the damn thing down again!).
All because she HATES 16:9 screens and manufacturers don't make laptops with 4:3 screens any more. She likes 1024x768 on a 4:3 14in screen, or at least likes it better than 16:9.

I have been seriously thinking about picking up a refurbished Lenovo R60, last of the 4:3 laptops, which would come with a C2D of around 2GHz and should be reasonably fast for the foreseeable future.

Do both of yourselves a favour and buy her an iPad. She won't have to worry about it slowing down, and you won't have to worry about re-imaging.

On an eight year old laptop, I suspect she'd only be using the internet, and maybe doing some word processing. iPads handle those tasks well. Furthermore, the iPad has a 4:3 screen, thus fitting that requirement, and iPads are well supported.

No matter what your views are on what iPads can and cannot do (I suspect from your comments about reformatting that you, like me, are a more hands-on user), I suspect that you'll agree that for simple tasks (such as those that can be done on an eight year old laptop), the iPad is a good alternative.

Obviously, I don't know your mum, so have probably got it all wrong.
schmidtbag 12th April 2012, 16:14 Quote
@Bakes
I thought the same thing. However, if this person is too cheap to get a newer laptop, chances are she won't want to get something as expensive as an iPad.

@Adnoctum
Well, just because you personally dislike the appearance of some screens, it doesn't mean everyone else does. Meanwhile, 1366x768 is a good resolution for a small laptop, tablet, or netbook. As for your mom not liking resolutions higher than 1024x768, what exactly is the reason? Why not just get a 1366x768 laptop and use 1024x768 as the resolution? If she doesn't like the physical width of the screen, then find something to prevent the resolution from stretching to both sides of the screen.


As for responding to this article, I both strongly agree and strongly disagree. I absolutely agree with the aspect of preparing yourself with something a little extra to ensure you don't need a replacement system in the future, but it isn't that simple:
1. Portable devices like laptops get noticeably cooler, faster, and more power-efficient every couple of years. The extra you spend on a laptop intended for minimal purposes could instead be put towards the new laptop that will inevitably be purchased somewhere down the road.
2. Portable devices' batteries have a shelf life. Using one laptop for a very long time means that battery will eventually need to be replaced, and we all know how expensive batteries can be.
3. Some people's definition of the "bare minimum" is VERY different. People on bit-tech's forums act like 2GB should be the bare minimum and a 1.66GHz Intel Atom is complete garbage. If you run Vista or 7, then yes, that hardware might be a little on the low side, but to me that is plenty for web browsing and office stuff. My netbook has a 900MHz Celeron and 1GB of RAM running Linux, and after over a year of constant use, it still runs faster than the average Windows XP machine I've encountered.
4. Not everyone has the money to spend on hardware that is better than the bare minimum.

Sometimes it is better to let people buy the crappier device so they know not to go that cheap in the future. But, I've known people who have done the complete opposite of what you suggested - they come to you for advice and then they buy something overkill. I knew someone who bought a quad core with 4GB of RAM strictly for web browsing and email and nothing else - not even office or media, and no antivirus installed.
nilesfoundglory 12th April 2012, 16:29 Quote
Quote:
Originally Posted by Adnoctum
People might be more inclined to upgrade more often if manufacturers gave consumers what they want and not what is convenient or easy... Who cares if laptops come with 8 cores and clocked to 4GHz if they come with the worst screens ever made? How many people are happy with shiny, 15.6in, 1366x768, low DPI screens with the most appalling colour reproduction ever seen?

I know this is alpha-geek heresy, but there's a reason why Apple is doing better than most in laptop sales. A 13" MacBook Air comes with a better screen resolution than most 15" Windows laptops, despite its otherwise-puny specs.
Quote:
Originally Posted by Adnoctum
As a point of interest, what do people use their laptops for? The people in my social sphere use it for browsing the Internet and creating various documents, neither of which is enhanced by 15.6in 1366x768 screens.

From what I can tell (at least here in the States in a metropolitan area), they use their laptops for everything. Movies, photos, web, research, gaming (light & heavy), art... In most cases, it's the only computer they own. Despite my rational objection, 16:9 is a good aspect ratio for all of this. Heck, Photoshop became easier by leaps and bounds for me the instant I moved from a 4:3 to a 16:10 (and, eventually, a 16:9) display because I could finally fit all of my open palettes next to a 3:2 or a 4:3 picture without encroaching upon the frame of what I was working on. Spreadsheet maniacs love it because they get more horizontal space to work with.
Quote:
Originally Posted by Adnoctum
What I find is that the same group of people don't use their laptops for sitting down to watch movies or full-screen Flash videos, so why is this made the main feature of laptops? Other than the laptop manufacturers' race to the bottom?

First: Just because you don't bear witness to it doesn't mean it's not occurring. Second: Yes, there's always going to be a race to the bottom. For current examples, see tablet computing, entry level DSLR cameras, smartphones, and practically any unessential-to-human-survival consumer appliance.
Quote:
Originally Posted by Adnoctum
I'd pay a 30% premium for a basic laptop with a better screen, without having to spend $2000+ on a high-end system.

So, you'd like an Apple computer, then? Now I don't feel so bad for my earlier statement ;)
Material 12th April 2012, 16:32 Quote
Quote:
Originally Posted by schmidtbag

Sometimes it is better to let people buy the crappier device so they know not to go that cheap in the future.

I actually meant to mention this in the blog but forgot.

If this happens - if people are allowed to buy hardware that won't be up to the task - then when things go wrong they will blame the hardware, not themselves or their purchase decision. Humans are generally pretty bad at admitting mistakes, especially expensive ones where there is some hardware they can blame instead.

This'll just lead to people becoming increasingly anti-PC. The experience needs to be as smooth and simple as possible to convince people that they need a PC in their life, and you'll only get that by having the correct hardware or setup.
nilesfoundglory 12th April 2012, 16:37 Quote
Quote:
Originally Posted by nilesfoundglory
Despite my rational objection, 16:9 is a good aspect ratio for all of this.

Whoopsie. Meant to say: "1366 x 768 is a good resolution for all of this." Reason being, it doesn't stress the hardware and effectively maintains a certain level of 'expectation' for the user. 720p for a movie on a 15" laptop is more than enough unless you're blind-granny, nose-to-the-screen close, looking for pixel imperfections. The same can be said for gaming. Few widely-played, currently-played, non-hardcore games stress the underlying hardware to the point where anyone would notice slowdown. A few years down the line, someone might notice their system is no longer up to snuff - which would probably be around the time the hardware is out of warranty and a new computer would be considered anyway.
Adnoctum 12th April 2012, 18:34 Quote
I've walked away and had a bit of a think about the article.

Funnily enough, when I'm making a purchase for myself, my family or my work, I'm not thinking about a "healthy and prosperous tech industry", I'm thinking about the needs of the purchaser and the available resources.
It isn't the job of consumers to subsidise the tech industry by purchasing more resources than would reasonably be needed.
It certainly isn't my mum's job to spend her pension on things she doesn't need in order to churn products for multi-nationals. Nor that of my less informed friends. Nor even myself.

If you think that I am misrepresenting the article, I disagree. That is what is being said - "Buy more expensive stuff (on the off-chance they may need it in the future...maybe), so the tech industry makes more of the better stuff".

I think there is a disconnect between what people think they will do on their laptops (especially at the point of sale) and what they actually do, which is where I disagree with the article.
I think people think about all the things they could be doing with their laptops, but find they never do them in practice. This is especially the case if they have bought with the expert help they receive from the retail outlet. Did you see the seething sarcasm there?
Naturally, this is not a blanket comment, just my opinion about what I see. People keep talking about all those images and home videos and whatever they get enthused about, but when it comes down to it, they don't actually do it.
Timmy's netbook failure has more to do with a misinformed consumer and nothing at all to do with the weakness of Atom processors or the netbook platform. I have an Atom-based system that functions perfectly adequately because I don't expect it to be anything more than it is. Atom is a bit weak for my mum, but a Brazos-based multi-core system would be perfectly fine (if it came with a 4:3 screen).
Adnoctum 12th April 2012, 18:48 Quote
@ Comments.

Sadly, my mum'd throw a fit if I presented her with an iPad.
I installed Ubuntu on her laptop and made the interface as XP-like as possible. She was already using Firefox and didn't like the Ribbon-interfaced Office 2007 so I thought she could easily get to like Open Office.
I know, it was inspired, right? She lasted two weeks and I had to re-image the laptop.
It's not as if she doesn't like new things (she instantly liked using Tabs in Firefox), she just doesn't like new things for the sake of new things. Gadgets are wasted on her! And she doesn't like lots of changes all at once. Introducing an iPad would probably end with a crime scene and a police investigation.

I didn't say that people don't use laptops for videos, I said that in my social group (which encompasses both my personal life, which spans retirees like my mother, casual users, and enthusiasts and gamers, and my professional life, which is in a very mixed business/office situation that includes IT professionals) people aren't using their laptops for much more than Internet browsing (including casual Youtube videos) and document creation. What you and your acquaintances do is going to be different from what mine do.

16:9 is great if your activity needs width (movie watching, spreadsheets, audio/video editing, etc), but terrible for anything requiring height (everything else).

At work, if I had a dollar for every time someone complained to me about how awful their super laptop is with a 1366x768 screen, I'd estimate I could buy a return airfare to Bali (and yes, I just did a check of airfares!). An internet search would show forums full of similar people.
If a laptop manufacturer offered a product with a 4:3 screen with a 20-30% cost premium, it would be a big seller to business/office users.
One office I know has given everyone laptops as desktop replacements. Some "manager" thought he/she was being smart and justified it with some dubious reasoning: workers could take their work with them (even though they don't), and it has given IT more security and network access headaches to sort out. The laptops have become little more than terminals, because the workers have to use more ergonomic USB mice and keyboards, and they're hooked up to 17in 4:3 screens because the laptop screens (raised up on expensive arms) proved massively unproductive.
People couldn't function properly with 1366x768, but are perfectly fine with 1280x1024. It is not just a matter of what they are used to, because they all go back home to their crappy little 1366x768 laptops.

My current laptop has a 12.1in screen with a 1280x800 resolution (LP C2D, not Atom) and it is for mobility use and some Internet browsing. I'd get a tablet, but I like the keyboard. The fact that it is a full power C2D doesn't hurt.
I could never use Photoshop on it because of the terrible colour reproduction of the TN panel. Compared to my IPS desktop monitor, it is like comparing real life to dog vision, and this was a premium $2000 laptop. It is so... I'm sorry, I have to say it... s!*t that I was shocked by how washed out the colours were when I first used it. Newer TN-based laptop screens aren't as bad, but...

I wouldn't mind paying an Apple-like premium for a better screen, but it is all the other design choices that Apple makes that I object to. I don't want my laptop to be a fraction of an inch thick and light enough to blow away in a slight breeze, and I especially don't want to send the laptop back to Apple to change the battery and forego a DVD drive (I still use mine, thanks).
I think I'd have to give up on life if I had to own an Apple laptop (and yes, I've used several with OSX). I suspect that I'd have to install a Linux distro on it. It would sicken me on several different emotional levels, and this is from someone who owns more 680x0 and PPC-based Apples than he could reasonably explain.

Another pet hate: PC stuff that gets power via Molex plugs (fans, fan controllers, PCIe adaptors). Please, please, move to SATA power connectors already! I have two PCs that don't have Molex connectors and I had to buy a SATA-to-Molex adaptor on eBay.
schmidtbag 12th April 2012, 19:44 Quote
@Adnoctum
I pretty much entirely agree with your 2nd last post, you made a lot of valid points.

Regarding your 2nd post, I understand how your mom doesn't like change, but sometimes change is necessary. I knew someone who wanted a multi-monitor setup but wanted his monitors set to 1024x768 just because it was more familiar to him. I told him that if he just increased the resolution to the native amount (which was nearly twice that) then he could probably drop one entire monitor, but he didn't care. It isn't difficult to make Linux operate nearly identically to Windows; I did the same thing for my mom as you did with yours (except I used Debian instead of Ubuntu) and she has yet to find any problems with it.

And yes, I understand that people don't want to move out of their comfort zones and they shouldn't need to either. However, people who can't accept something like a graphical change or a different (possibly faster/easier) way of doing something are the same people who cause problems like HTML5 not being adopted years earlier. Some people just need to suck it up and deal with newer things whether they like it or not. If you're not comfortable with the switch then maybe you shouldn't be in a rapidly expanding market. Nothing states you MUST use a computer in your free time; it is entirely optional.

And again, get a widescreen monitor but set it to a 4:3 ratio - there are ways to make it not stretched out. And 16:9 resolutions are practical for everything: they offer the same height in pixels as 4:3 monitors, just additional width. If you really need the extra height, rotate the monitor 90 degrees and you've got a super long display. I think the cases of people not functioning with 1366x768 due to "display quality" are coincidental. That is a very typical and reasonable resolution for small laptops or large netbooks. I have yet to see a display like that which I strongly disliked. I'd like to know what brands you and the people you know are getting that have such awful displays.
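
For anyone who wants to try the unstretched-4:3 trick on a Linux laptop, here's a rough sketch of one way it might be done. It assumes an X11 session with the xrandr tool installed and an Intel graphics driver that exposes the "scaling mode" output property; the output name (LVDS1) and the 1024x768 mode are placeholders - check xrandr --query for the real names on your machine.

#!/usr/bin/env python3
"""Run a 4:3 mode on a widescreen laptop panel without stretching.

A minimal sketch, assuming an X11 session, xrandr on the PATH and an
Intel graphics driver that exposes the "scaling mode" output property.
The output name and mode below are placeholders - see `xrandr --query`.
"""
import subprocess

OUTPUT = "LVDS1"    # hypothetical output name - check `xrandr --query`
MODE = "1024x768"   # the 4:3 mode to switch to

def xrandr(*args):
    # Run xrandr with the given arguments, raising if the command fails.
    subprocess.run(["xrandr", *args], check=True)

# Ask the panel to centre non-native modes instead of stretching them to
# fill the 16:9 glass ("Full aspect" letterboxes while keeping the aspect
# ratio; "Center" shows the mode pixel-for-pixel with black borders).
xrandr("--output", OUTPUT, "--set", "scaling mode", "Center")

# Switch to the 4:3 mode. If the mode isn't already listed for this
# output, it would first need to be added with cvt/--newmode/--addmode.
xrandr("--output", OUTPUT, "--mode", MODE)

On other drivers the equivalent knob may have a different name (or live in the GPU control panel), so treat this as a starting point rather than a guaranteed recipe.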

There are MacBooks with removable batteries and DVD drives that are just as portable as your 12.1" laptop. I'm no sponsor of Apple, I'm just saying your complaint isn't valid. I'm not sure why you hate their computers so much. Sure, they're ridiculously expensive and contain mediocre hardware, but they're still designed well (not the cheap MacBooks, those suck) and have nice features.

I would also blame you for getting a PSU that doesn't use molex connectors. Personally, I like them because they're more flexible than SATA connectors, since they don't have as many wires. They're also easier to "hack" if you make your own hardware/mods.
Deders 12th April 2012, 19:55 Quote
The last laptop I was asked to check out, I pointed out that they could probably get a bigger screen that wasn't as shiny, and that they would make much better use of a larger hard disk than they would of the RAM that was in there (500GB and 6GB). All this was completely irrelevant to her - she had already decided on that one because it was..... blue.
NethLyn 12th April 2012, 20:14 Quote
Good blog post. Like OzThe2 I got burnt, and after that I told that particular person asking for advice to stick to buying Dell, as I wasn't putting myself out for the hassle any more.

As for my Mum, she was happy that Vista looks (somewhat) like 7, so she didn't have to learn much that was new, and I set all the browsers to take her straight to web mail, but she was conscious of an older PC's slower speed. So instead of spending any more upgrade money on an older machine I used to use, she bought a laptop instead. The screen was fine for her, but she hated the smaller keyboard and trackpad, so she doesn't use the portable as much even after I added a real mouse.

So for the next PC she'll most likely be on the Dell trail as well, as the next-door neighbours are laughing with a Dell Q6600 running Vista, which is powerful enough to upgrade to 7 or 8 and still do everything they want.

As for Paul's views in the blog post, at least nowadays you could tell someone to get an AMD laptop if they're even thinking about playing games more powerful than Solitaire - then they would be covered for as long as they could be bothered to seek out the mobile GPU driver updates.
yodasarmpit 12th April 2012, 20:29 Quote
In reality, 90% of the population use a PC to surf the web; it's not a work tool.
That, I guess, is why iPads and the like are so popular: they accomplish their main task out of the box.

A low-end Dell setup will be more than enough for the majority of people, whilst not breaking the bank.
warejon9 12th April 2012, 20:34 Quote
I think another reason for ditching the 4:3 screens was that with 16:9 or 16:10 you can have two A4-sized documents open at the same time, so when using your laptop to write up or copy documents it is a lot easier.
Xlog 12th April 2012, 20:34 Quote
Quote:
Originally Posted by schmidtbag

And again, get a widescreen monitor but set it to a 4:3 ratio - there are ways to make it not stretched out. And 16:9 resolutions are practical for everything: they offer the same height in pixels as 4:3 monitors, just additional width. If you really need the extra height, rotate the monitor 90 degrees and you've got a super long display. I think the cases of people not functioning with 1366x768 due to "display quality" are coincidental. That is a very typical and reasonable resolution for small laptops or large netbooks. I have yet to see a display like that which I strongly disliked. I'd like to know what brands you and the people you know are getting that have such awful displays.

It might be the case for a desktop monitor, but you can't rotate a laptop screen, and 768 pixels of vertical resolution is crap - all you do is scroll; it's like going back 12 years. A 16:9 screen offers the same vertical resolution as 4:3 only if you go for full HD, and for a laptop that means 17"+ or paying €1200+.
Bakes 12th April 2012, 23:20 Quote
Quote:
Originally Posted by Adnoctum
@ Comments.

Sadly, my mum'd throw a fit if I presented her with an iPad.
I installed Ubuntu on her laptop and made the interface as XP-like as possible. She was already using Firefox and didn't like the Ribbon-interfaced Office 2007 so I thought she could easily get to like Open Office.
I know, it was inspired, right? She lasted two weeks and I had to re-image the laptop.
It's not as if she doesn't like new things (she instantly liked using Tabs in Firefox), she just doesn't like new things for the sake of new things. Gadgets are wasted on her! And she doesn't like lots of changes all at once. Introducing an iPad would probably end with a crime scene and a police investigation.

I didn't say that people don't use laptops for videos, I said that in my social group (which encompasses both my personal life, which spans retirees like my mother, casual users, and enthusiasts and gamers, and my professional life, which is in a very mixed business/office situation that includes IT professionals) people aren't using their laptops for much more than Internet browsing (including casual Youtube videos) and document creation. What you and your acquaintances do is going to be different from what mine do.

16:9 is great if your activity needs width (movie watching, spreadsheets, audio/video editing, etc), but terrible for anything requiring height (everything else).
I see your issue about the whole iPad thing.

I generally find that a massive regime change is taken better than subtle reform. This is, in my view, one of the problems with Linux - for so long it was focused on being a Windows replacement that it gained no traction, because there were many things it did worse than Windows. Now we have Unity and other interfaces, and it's becoming more popular because it's becoming a genuine alternative; it's no longer just trying to look like Windows, and as such is succeeding.
I bought an iPad recently, and though I only expected to use it for stuff like Netflix, I've ended up using it for most of my daily computing needs. For a risky strategy, you could give her an iPad, show her word processing and suchlike, show her how to do most of her daily things on it, and see in a couple of months whether she's still using her laptop much. The risk of course is that you've just given her a £400 device, and she might not use it at all.

I bought my mum an iPhone last year, and she uses it for Sudoku and Scrabble. £500 well spent right there.

Two solutions to the whole 16:9 problem.
1) Use a 16:10 monitor -- it's a fair bit closer to 4:3.
2) Get a massive monitor (with an appropriately massive resolution). My 2560x1600 gives as much vertical real estate as a pretty big 4:3 screen gives horizontal.
XXAOSICXX 12th April 2012, 23:32 Quote
When I'm helping friends and family with deciding on a technology-related purchase I help them the most by helping them redefine "bare minimum".

You can, if you choose, build a working computer for less than £200 out of budget OEM parts. You will, of course, end up with faulty components all over the show over the life of the computer; this is naturally a false economy and does nothing more than compound the user's view that technology is, and should remain, alien to them.

Thus, my approach - whether it's PCs, laptops, cameras, phones or whatever - is to look at the bare-minimum they can get away with for their typically limited budget that *doesn't* compromise on quality.

I think we're often spoiled by technology and how cheap much of it has become...and that's led to some shockingly bad products at the lower end of the market.

I bought a top-of-the-line Garmin sat-nav about 5 years ago that set me back almost 500 quid and I'm still using it to this day (albeit with updated maps) and it really is superb.

My super-sceptical boss who hates sat-navs and thinks maps are the future eventually bought a bottom of the range Tom Tom (*spits*) for his wife who absolutely hates it because it's almost completely useless. They wasted 90 quid on a turd of a product when they could have spent just a bit more, say, £150, and had something a bit more respectable that they'd still be using.

It's a bit like shopping for alcohol....you always want the best one, you can only really afford the "average" one, and you'd never ever ever buy the supermarkets' own brand. Bottom-rung products simply shouldn't be made...people would be far happier with their purchases all round.
vampalan 12th April 2012, 23:39 Quote
http://www.youtube.com/watch?v=C5z0Ia5jDt4
Watch this if you haven't already done so. So, when people ask me what computer to buy, it's: get a Mac, buy iCare or whatever it's called, and leave me alone.
But if the question was which bits to buy to play BF3 as fast as possible.. that's different.
Sloth 13th April 2012, 00:43 Quote
Quote:
Originally Posted by XXAOSICXX

I bought a top-of-the-line Garmin sat-nav about 5 years ago that set me back almost 500 quid and I'm still using it to this day (albeit with updated maps) and it really is superb.

My super-sceptical boss who hates sat-navs and thinks maps are the future eventually bought a bottom of the range Tom Tom (*spits*) for his wife who absolutely hates it because it's almost completely useless. They wasted 90 quid on a turd of a product when they could have spent just a bit more, say, £150, and had something a bit more respectable that they'd still be using.
I've been trying to get my dad, an avid hiker, skier and general adventurer, to consider getting a GPS unit but it's been a tough battle. Every time he starts getting close to buying one it's a low end unit which I can tell won't live up to his expectations, but since he's never used anything like it before he won't spend hundreds of dollars on a first purchase. There have been signs of progress, though! I could see a spark of interest showing him Google Maps on my phone.
Quote:
Originally Posted by vampalan
http://www.youtube.com/watch?v=C5z0Ia5jDt4
Watch this if you haven't already done so. So, when people ask me what computer to buy, it's: get a Mac, buy iCare or whatever it's called, and leave me alone.
But if the question was which bits to buy to play BF3 as fast as possible.. that's different.
Unfortunately that's becoming my stance as well. It fits their need, perhaps not as cheaply as it could, but most importantly they're happy and afterwards if/when they have any technical Mac problems I can say "I told you so" and direct them to the extensive Apple support network rather than fixing/explaining it myself.
leexgx 13th April 2012, 02:56 Quote
Quote:
Originally Posted by NethLyn

As for Paul's views in the blog post, at least nowadays you could tell someone to get an AMD laptop if they're even thinking about playing games more powerful than Solitaire - then they would be covered for as long as they could be bothered to seek out the mobile GPU driver updates.
Intel is easy now:
any i3 or better, or a Celeron dual-core (as it's just an i3 with some bits chopped off) - you can never get a bad laptop with an i3 or mobile Celeron dual-core (just buy one that has a good touchpad).

You've got to be careful with AMD laptops, as most shops and laptop makers are mis-selling the 1.3GHz dual-core E-300 or 1GHz dual-core C-50 AMD CPUs in full-size laptops when they are meant for netbooks; they're not much faster than an Atom CPU (though the GPU is faster).

I've got one customer with an E-300 system that had a bigger screen, a 750GB HDD and 6GB of RAM, but it was sluggish whenever one core was maxed out (the mouse even stopped moving intermittently).

http://www.pcworld.co.uk/gbuk/laptops/703_7006_70006_xx_xx/xx-criteria.html - the first three (C-50/E-300) laptops are slower than C2D laptops and are a waste of money; the Intel Celeron B815 is OK as it's based on Sandy Bridge even though it's running at 1.6GHz, and the N-series AMD-based laptops are OK.

(cpu-world currently has routing issues in the UK at the time of posting; use Opera with Turbo if the site doesn't load - it's the fourth icon from the bottom-left-hand corner, with the speed dial on it, to enable access to the site.)

http://www.cpu-world.com/CPUs/Bobcat/TYPE-C-Series.html - a waste of money (for netbooks or maybe ultra-portables only; they should not be in full-size laptops).
http://www.cpu-world.com/CPUs/Celeron_Dual-Core/Intel-Mobile%20Celeron%20B815.html - OK, as it's based on Sandy Bridge.
http://www.cpu-world.com/CPUs/K10/AMD-Athlon%20II%20Dual-Core%20Mobile%20N330%20-%20AMN330DCR22GM.html - OK as well, as they are faster than 2GHz.

Only look at the A6 or A8 CPUs from AMD, or dual-core 2GHz or higher; there's no reason to buy single-core laptops now (the only issue is that most shops do not show the CPU spec - PC World do, on the product details page).

Sorry if it looks like an AMD bash, but this is more about laptop makers using a CPU for something it's not intended for. I have three or four customers who have bought a laptop that was slower than the old laptop that broke (normally a C2D or similar). Would you use an Atom in a full-size laptop? I'd say no, and Intel won't allow it; the problem is AMD aren't enforcing rules to limit the E and C series to smaller netbooks or ultra-portables.
schmidtbag 13th April 2012, 05:52 Quote
Quote:
Originally Posted by Bakes

I generally find that a massive regime change is taken better than subtle reform. This is, in my view, one of the problems with Linux - for so long it was focused on being a Windows replacement that it gained no traction, because there were many things it did worse than Windows. Now we have Unity and other interfaces, and it's becoming more popular because it's becoming a genuine alternative; it's no longer just trying to look like Windows, and as such is succeeding.

I completely disagree with this. GNOME has been the default desktop environment of Linux for a very long time, and it is not Windows-like at all except for the window decorations (title bar, minimize, maximize, close, etc.) and the taskbar. KDE has been very Windows-like, but has also adopted many Mac features. I like to think of it as if Windows and Mac had a child together who grew up to be rebellious and do things his own way. As many people on this forum have posted, people like familiar things. It is daunting to switch an entire OS and use a whole new collection of programs, and it becomes overwhelming to some people when everything looks and operates differently too.
When you look at the less popular environments like XFCE or Enlightenment, those are just plain weird. LXDE is probably the only one that is strikingly similar to Windows out of the box (Windows 2000, anyway).

As for Unity, that is massively unpopular. If you go to distrowatch.com, you'll find that Ubuntu's popularity has dropped by nearly 1000 hits-per-day since Unity became the default. Linux in general is becoming more known and accepted as a desktop OS coincidentally around the same time that Unity came out, because it's starting to do better than Windows at nearly everything. Linux's Nvidia drivers perform almost on par with Windows', and the Intel drivers are faster than Windows'. CPU and RAM usage has been better with Linux for over 10 years now. Nearly all modern wifi, audio, and video capture devices work out of the box. The EXT4 and Btrfs filesystems are either superior in features or superior in performance to NTFS. These things, as well as many other reasons, are why Linux is gaining popularity. The desktop environments, I'd say, have a minimal (but real) impact.
erratum1 13th April 2012, 08:04 Quote
The problem with this article is that new hardware comes out, prices fall and new technologies are created.

You can spend thousands on a PC in the hope that you're going to use that power on something, and within six months it's old.

Buy a PC for what you need it for; if you get into photography later, prices will have dropped by then or new, faster CPUs will be out.
PingCrosby 13th April 2012, 11:13 Quote
apathy?...who cares
lacuna 13th April 2012, 14:07 Quote
I bought myself an iPad for Christmas and it has basically completely replaced my laptop and PC. I've used my laptop once and my PC no more than 5 times since getting it
Bakes 13th April 2012, 15:10 Quote
Quote:
Originally Posted by schmidtbag
I completely disagree with this. GNOME has been the default desktop environment of Linux for a very long time, and it is not Windows-like at all except for the window decorations (title bar, minimize, maximize, close, etc.) and the taskbar. KDE has been very Windows-like, but has also adopted many Mac features. I like to think of it as if Windows and Mac had a child together who grew up to be rebellious and do things his own way. As many people on this forum have posted, people like familiar things. It is daunting to switch an entire OS and use a whole new collection of programs, and it becomes overwhelming to some people when everything looks and operates differently too.
When you look at the less popular environments like XFCE or Enlightenment, those are just plain weird. LXDE is probably the only one that is strikingly similar to Windows out of the box (Windows 2000, anyway).

As for Unity, that is massively unpopular. If you go to distrowatch.com, you'll find that Ubuntu's popularity has dropped by nearly 1000 hits-per-day since Unity became the default. Linux in general is becoming more known and accepted as a desktop OS coincidentally around the same time that Unity came out, because it's starting to do better than Windows at nearly everything. Linux's Nvidia drivers perform almost on par with Windows', and the Intel drivers are faster than Windows'. CPU and RAM usage has been better with Linux for over 10 years now. Nearly all modern wifi, audio, and video capture devices work out of the box. The EXT4 and Btrfs filesystems are either superior in features or superior in performance to NTFS. These things, as well as many other reasons, are why Linux is gaining popularity. The desktop environments, I'd say, have a minimal (but real) impact.

You've made the classic blunder of thinking that everyone has the same needs as yourself. I'll go through your post point by point.

It's not at all daunting to switch OS and use completely different programs. Apple has sold hundreds of millions of iOS devices, and they've been picked up by the users very well. It's far more daunting to use similar software, because you're looking for the visual cues you've been used to (but they aren't there). For example, switching from Windows to OSX would almost certainly be difficult for the majority of British people for the simple reason that lots of the keys are switched around, and the shortcuts are different. Meanwhile, the ribbon in Word 2011 is different to the ribbon in Word 2010, it contains different items and behaves slightly differently. This is yet another difficulty.
A new piece of technology is much easier to learn than a modified old one, because you don't need to unlearn what you already know.
Learning to use KDE is a problem for this exact reason - you expect something to work like Windows, but it doesn't, so you see if it works like OSX (it doesn't), so it becomes a massive chore, because you're constantly recalling your knowledge about other operating systems, which is not necessarily helpful.
Similarities are only useful if the behaviour is very similar in most use - it's why Open Office was so successful at the start.

Learning iOS is a much easier experience, purely because its behaviour is so different.

Unity is currently unpopular - but only because it's still immature software. It's been improving as it's matured.

Now, performance:

My post was about a computer for an old lady who currently computes on a Celeron running at 1.6GHz. She seems to use her computer for the internet and for office. It's safe to assume that she does not give a rats arse about the superiorities of the nVidia drivers or the qualities of her filesystem.

If you're a systems engineer, sure, these are important considerations. For the majority of people, they are not - I am definitely a power user, and yet I place very little weighting on the filesystem used in my chosen operating system.

The vast majority of people do not care whether their computers use EXT4 or Btrfs or even something old like ReiserFS. They do not care whether their computers use bash or dash, or if their graphics performance is 0.05% better due to improved drivers. They care that they can use the operating system effectively.

On your list of improvements, the only one I can agree with is the increased peripheral compatibility in recent years. ndiswrapper was a pain in the ass.

But anyway, we're going off topic.
schmidtbag 13th April 2012, 17:14 Quote
Quote:
Originally Posted by Bakes
You've made the classic blunder of thinking that everyone has the same needs as yourself. I'll go through your post point by point.
No, I didn't. I don't use things like GNOME or XFCE or Unity. I've tried them, and their default setups are not like Windows. I was referring to their default setups, not the way I set them up - the way I set them up would make my argument completely irrelevant.
Quote:
It's not at all daunting to switch OS and use completely different programs. Apple has sold hundreds of millions of iOS devices, and they've been picked up by the users very well.
Apple products are designed for simplicity and stability. Considering they haven't really changed a whole lot within the past 10 years (except iOS), I'd say they've done a pretty good job, and enough people are familiar with how Macs work. Macs tend to offer fewer features than Windows, whereas something like Linux has a lot more. This is what makes Linux daunting.
Quote:
It's far more daunting to use similar software, because you're looking for the visual cues you've been used to (but they aren't there). For example, switching from Windows to OSX would almost certainly be difficult for the majority of British people for the simple reason that lots of the keys are switched around, and the shortcuts are different. Meanwhile, the ribbon in Word 2011 is different to the ribbon in Word 2010, it contains different items and behaves slightly differently. This is yet another difficulty.
Yes, I agree with this. But this isn't so much daunting to people as frustrating or confusing. I'm talking about people who are trying something new that is so different that it's discouraging for them to use it.
Quote:
A new piece of technology is much easier to learn than a modified old one, because you don't need to unlearn what you already know.
Another very valid point.
Quote:
Learning to use KDE is a problem for this exact reason - you expect something to work like Windows, but it doesn't, so you see if it works like OSX (it doesn't), so it becomes a massive chore, because you're constantly recalling your knowledge about other operating systems, which is not necessarily helpful.
....huh? First you're saying that it's "not at all daunting to switch to another OS and use completely different programs" and that it's easier to use something new than something old and modified, but now you're saying KDE doesn't work like Mac or Windows so it's difficult to use. Kinda hypocritical. I don't remember any features in KDE that didn't do what I thought they would, and otherwise it only takes one more click to figure it out. I found Macs much harder to learn for the first time than KDE, because Macs actually lack some cues entirely. For example, a new user wouldn't know that you have to drag a program into the trash to uninstall it, and if they're stuck with a 1-button mouse, there's nothing that tells the user how to empty the trash.
Quote:
Unity is currently unpopular - but only because it's still immature software. It's been improving as it's matured.
Yes, that is true. I'm sure Unity will be a worthy competitor in time. Not sure if I'll ever be interested in it.
Quote:
My post was about a computer for an old lady who currently computes on a Celeron running at 1.6GHz. She seems to use her computer for the internet and for office. It's safe to assume that she does not give a rats arse about the superiorities of the nVidia drivers or the qualities of her filesystem.
As true as that is, that isn't really relevant to this argument which is why I never brought it up in my previous post. People as of today don't use linux unless they see a compelling reason to switch. Although I find it easy, it is not user friendly to most people. So, people switch to it specifically because of its benefits, like how it handles hardware. My previous post only disagreed with how you felt about Unity and linux trying to be similar to Windows.
Quote:

The vast majority of people do not care whether their computers use EXT4 or Btrfs or even something old like ReiserFS. They do not care whether their computers use bash or dash, or if their graphics performance is 0.05% better due to improved drivers. They care that they can use the operating system effectively.
Right, so the very few people who are aware of what Linux is, and the even fewer of that group who want to use it as their main desktop OS, are the people who want it for its benefits. If they aren't, they're the kind of people who just want something free and not pirated, or who get Linux because of its viral immunity and eye-candy.
Quote:
On your list of improvements, the only one I can agree with is the increased peripheral compatibility in recent years. ndiswrapper was a pain in the ass.
Well a lot of what I mentioned isn't really noticeable to the average user or enthusiast, but there are benchmarks to prove a lot of what I said.
Bakes 13th April 2012, 23:54 Quote
Quote:
Originally Posted by schmidtbag
No, I didn't. I don't use things like GNOME or XFCE or Unity. I've tried them, and their default setups are not like Windows. I was referring to their default setups, not the way I set them up - the way I set them up would make my argument completely irrelevant.
May I ask what you do use? If you do simply use the terminal, then I'd have to say that you aren't in any way the target of my post.
Quote:
Apple products are designed for simplicity and stability. Considering they haven't really changed a whole lot within the past 10 years (except iOS), I'd say they've done a pretty good job, and enough people are familiar with how Macs work. Macs tend to offer fewer features than Windows, whereas something like Linux has a lot more. This is what makes Linux daunting.

OS X has been fairly incremental, but it shows a whole lot more promise than it used to. Lion was a set of changes that looked boring on paper, but have made me far more productive.

Thanks to MacPorts and similar, OS X can use most of the libraries and packages written for other Unix based systems. Again, it's proved most helpful to me.
Quote:
....huh? First you're saying that it's "not at all daunting to switch to another OS and use completely different programs" and that it's easier to use something new than something old and modified, but now you're saying KDE doesn't work like Mac or Windows so it's difficult to use. Kinda hypocritical. I don't remember any features in KDE that didn't do what I thought they would, and otherwise it only takes one more click to figure it out. I found Macs much harder to learn for the first time than KDE, because Macs actually lack some cues entirely. For example, a new user wouldn't know that you have to drag a program into the trash to uninstall it, and if they're stuck with a 1-button mouse, there's nothing that tells the user how to empty the trash.

Macs don't have one-button mice anymore, fyi. They're all bundled with either a trackpad or a multitouch mouse, for better or worse. I'm pretty sure there's an 'empty the trash' button in the trash window, and if you click the Finder menu, there's also an 'empty the trash' menu item. Your point about uninstalling is fine, but you might not expect a user new to Linux to know to open Synaptic or YAST or apt or whatever in order to uninstall programs either. The OSX method is arguably more intuitive.

My point about KDE is that it's enough like both Windows and OS X that when I try to use it, I draw my intuition from both of them, and thus get caught out in the differences. My point is that KDE is close enough to both of them that it becomes a pain in the arse - the paradigms are subtly altered, as opposed to diverging enough that they are simply different.

I was not simply being self-contradictory.

Gnome does not have these issues. (My favourite is XFCE, if only for its file manager: right click - open terminal in this folder. Wonderful.)
Quote:
As true as that is, that isn't really relevant to this argument which is why I never brought it up in my previous post. People as of today don't use linux unless they see a compelling reason to switch. Although I find it easy, it is not user friendly to most people. So, people switch to it specifically because of its benefits, like how it handles hardware. My previous post only disagreed with how you felt about Unity and linux trying to be similar to Windows.

Yes, I am aware that Linux is currently only used by those who see compelling reasons to switch - that in my view is the key reason why Linux distros should stop trying to put forward the whole 'You can use it to replace Windows' line. The closer it is to Windows in feature parity, the more people will complain about the parts that are lacking.
Quote:
Well a lot of what I mentioned isn't really noticeable to the average user or enthusiast, but there are benchmarks to prove a lot of what I said.

I don't doubt it (although I would note that you are on shaky ground in terms of CPU efficiency, since the application software themselves are simply machine code instructions.)
schmidtbag 14th April 2012, 01:12 Quote
Quote:
Originally Posted by Bakes
May I ask what you do use? If you do simply use the terminal, then I'd have to say that you aren't in any way the target of my post.
I use KDE 4 on my main computer, LXDE on my netbook, and I tend to use GNOME 2 for other people who want to try linux, except I heavily modify it so it isn't so foreign to them. I do enjoy using the terminal - knowing how to use it can really increase productivity.
Quote:
OS X has been fairly incremental, but it shows a whole lot more promise than it used to. Lion was a set of changes that looked boring on paper, but have made me far more productive.

Thanks to MacPorts and similar, OS X can use most of the libraries and packages written for other Unix based systems. Again, it's proved most helpful to me.
I agree, but the key word is "incremental", whereas going from Windows XP to Vista would be considered almost exponential. If you used OS X when it first came out and compare it to Lion, at a glance it doesn't operate that differently (in a good way). There are a LOT of changes, huge improvements, new features, and so on, but overall it "feels" the same.
Quote:
Macs don't have one-button mice anymore, fyi. They're all bundled with either a trackpad or a multitouch mouse, for better or worse. I'm pretty sure there's an 'empty the trash' button in the trash window, and if you click the Finder menu, there's also an 'empty the trash' menu item. Your point about uninstalling is fine, but you might not expect a user new to Linux to know to open Synaptic or YAST or apt or whatever in order to uninstall programs either. The OSX method is arguably more intuitive.
Yes, I've seen a lot of newer macs take advantage of multiple button mice. Luckily for apple customers, the system is still functional with a 1-button mouse. I haven't tried Lion yet, and I don't remember if previous versions had the more apparent way of emptying the trash, but when I tried a mac for the first time, it took me a little too long to figure out how to empty the trash without right-clicking on it (I had a 2-button mouse but I wanted to see what it was like without one).
You are right about installing/uninstalling programs in Linux; however, as I've said before, Macs are designed to be simple and easy to use, whereas someone who gets into Linux needs to do some research ahead of time. The Linux method of installing/uninstalling is more organized and logical, but it's not very intuitive. The Mac method is simple but has many little situational problems, such as not being able to remove orphaned programs/files and accidentally uninstalling something when you just wanted to remove a launcher. I'd say Windows has had installing/uninstalling programs right from the very beginning, which to me is shocking considering how much they get wrong.
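
To make that comparison concrete, here's a rough sketch of what the "organized but unintuitive" Linux workflow looks like when scripted rather than clicked through. It assumes a Debian/Ubuntu-style system with apt-get available and sudo rights; the package name used here is only an example.

#!/usr/bin/env python3
"""Install and remove a package "the Linux way" - a minimal sketch.

Assumes a Debian/Ubuntu-style system with apt-get and sudo rights;
the package name below is only an example.
"""
import subprocess

PACKAGE = "vlc"  # example package - substitute whatever you actually want

def apt_get(*args):
    # Run apt-get non-interactively, raising if the command fails.
    subprocess.run(["sudo", "apt-get", "-y", *args], check=True)

# Install: the package manager resolves and fetches any dependencies for you.
apt_get("install", PACKAGE)

# Uninstall: removal goes back through the same package database...
apt_get("remove", PACKAGE)

# ...and orphaned dependencies can then be swept up in one go - the
# organized-but-unintuitive part described above.
apt_get("autoremove")
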
Quote:
My point about KDE is that it's enough like both Windows and OS X that when I try to use it, I draw my intuition from both of them, and thus get caught out in the differences. My point is that KDE is close enough to both of them that it becomes a pain in the arse - the paradigms are subtly altered, as opposed to diverging enough that they are simply different.

I was not simply being self-contradictory.
I suppose I can understand where you're coming from, although personally I never had much trouble learning to use it. Do you still have frustration with it today? If so, what do you find difficult about it?
Quote:
Gnome does not have these issues. (My favourite is XFCE, if only for its file manager: right click - open terminal in this folder. Wonderful.)
Haha, XFCE is probably the one I dislike the most - it strives to be lightweight but it's just as heavy as GNOME 2 yet is missing half the features. Thunar is a nice program, but I like the file managers of the other DEs better. Dolphin (KDE's default FM) has an optional built-in terminal pane. In PCManFM (LXDE's default FM), you can just press F4 and a new terminal window will pop up in the directory you're currently browsing.
Quote:
Yes, I am aware that Linux is currently only used by those who see compelling reasons to switch - that in my view is the key reason why Linux distros should stop trying to put forward the whole 'You can use it to replace Windows' line. The closer it is to Windows in feature parity, the more people will complain about the parts that are lacking.
I don't disagree with that; what I disagree with is the idea that they're trying to be like Windows. As I've said before, there are boundaries where things need to be familiar to some degree, or else it becomes daunting when nearly everything you try is different. Besides, if something is effective, why change it? Windows' interface has pretty much stayed the same since Windows 95, so that's about 18 years, so clearly it's effective. Windows 8 will be the first drastic UI change (one I'm not fond of - it's way too slow and clumsy) and even then, it still offers the classic UI.
Most open source software today is created to be what another product should have been, whether that means creating something similar to Windows with a few foreign quirks here and there, or being exactly like another product but free.
Quote:
I don't doubt it (although I would note that you are on shaky ground in terms of CPU efficiency, since the application software themselves are simply machine code instructions.)
Yes that's entirely true, so that being said, there's a limit to how efficiently a program can run (the limit would be the program running at or below kernel-level). I'm no expert on such low-level tasks but I would think that what makes a program run more efficiently is the way the OS treats the program's instructions. Programs that run on linux do often run faster than they do in windows, but generally at an unnoticeable level. The noticeable part is how the program, and the OS itself, is more stable and the usage of multiple cores. There is something linux does that handles programs differently.
For example, I had an old opteron based computer that I overclocked from 2.4 to 2.7GHz. Running stress tests for hours, Linux ran fine. It didn't run perfectly, but it recovered from failures. Windows XP, however, would BSoD before I could start the test. Vista would BSoD during the install. But if I brought the clock speeds down to 2.6, Windows ran just fine.
As another note, linux has been x86-64 and ARM compatible for a very very long time. Today, it's almost stupid to not go for 64 bit. It is, IMO, the most complete 64-bit OS.
Ciber 14th April 2012, 11:41 Quote
This apathy is why a guy at my work has a PC that came with just 128MB of RAM. It's only 5 years old but it basically doesn't work as a PC.
Pooeypants 14th April 2012, 19:06 Quote
What I find intriguing is that people aren't willing to pay 1000 quid for a top-spec laptop but will pay that price for an entry-level MacBook. It's all about perception; marketing makes all the difference. It doesn't matter about the specs or functions - people are generally dumb and just want shiny!
Xir 15th April 2012, 21:35 Quote
Quote:
Originally Posted by Sloth
I could see a spark of interest showing him Google Maps on my phone.
You're in the US...Try crossing a border with Google Maps on the phone :D
Don't get me wrong, it's great...as long as you don't switch to a country where your flatrate doesn't count. :(

(little tip...any other country, at least in our european contracts.)
Quote:
Sure, that is a lovely little netbook, but it won’t be any good for showing the videos of little Timmy’s first steps on, will it?’
Have you ever used one for what it was intended for?
Mine plays my videos just fine.
The new ones even have HDMI-out (the lack of which was admittedly a big bummer with the first ones).
Sloth 17th April 2012, 00:02 Quote
Quote:
Originally Posted by Xir
You're in the US...Try crossing a border with Google Maps on the phone :D
Don't get me wrong, it's great...as long as you don't switch to a country where your flatrate doesn't count. :(

(little tip...any other country, at least in our european contracts.)
Wait, there's more than just the US and Canada? :D

Jokes aside, I was showing it to one of the most technophobic and technologically illiterate people out there, the very idea of a hand held device which can display a map of the entire world and zoom in to specific places was a miracle in itself. His jaw almost literally dropped when I switched on the topo layer. All of the static mapping capabilities can be better covered by a dedicated GPS device without having to pay for a phone's data plan or dynamic data charges.
misterd77 17th April 2012, 04:48 Quote
I'm a big fan of buying second-hand high-end, which you can get for silly prices. Of course you lose the warranty, and the shiny, but you can get a desktop replacement for silly money. My current machine, a 17 inch Samsung R720, cost £1k new and has a dual-core 2.4GHz CPU, 4 gig of RAM, a 750GB HDD and an HD 4650 discrete GPU; I managed to get it for £250. It manages to play most games smoothly, and I couldn't be happier... Gumtree your next laptop purchase, you may be surprised at what's available... I was...