Archive for the ‘IT’ category

L.A. to go to cloud-based email

October 28, 2009

The city of Los Angeles is moving its email (for 30,000 employees) to Google.  This will be a good viability test for the system.  I know the city is mainly concerned about security, but I think this will also be a good test of uptime and customer service.

I think it will be a good practice run for Google.  If they can support this many users with business-level security and uptime needs, it will be a good starting point for selling the same service to other government organizations.

It should be noted, before we get too excited, that this is EMAIL.  Not file servers, not thin client desktops linked to a cloud-based array, just email.  It’s a fitting place for L.A. to start, since cloud-based email has been around for a long time already.  We’ll see if they migrate more areas over to cloud-based solutions as time goes on.

Here’s the L.A. Times article.

My experience with Active@ Boot Disk

October 19, 2009

Our office recently needed to do some data recovery.  We usually don’t have to, since all saved data is supposed to go onto a network share.  But there’s always the user who thinks they know better and keeps 3 GB of data on their desktop (not just the desktop computer, but the Windows desktop).  And of course, this would also be the user whose hard disk dies.  My sense of responsibility got the better of my schadenfreude, so I started trying to retrieve the data.

Occasionally, I’ve been able to retrieve data by fiddling with things until I could temporarily access the drive.  That didn’t work this time.  Enter Active@ Boot Disk.  I’d used the demo DOS-based version of their software before and wasn’t particularly impressed.  But we decided to spring for the Windows-based version, and I’m glad we did.  It’s intuitive and effective, and I will consequently spend much less time recovering files in the future.  Check it out if you have a chance.

Some of the things that particularly impressed me about the Win version:

– the ability to map a network drive, then save recovered files to it
– Remote Desktop to control another computer
– a Windows password reset tool (I already have a freeware tool that does this, but it’s nice to have)
– a registry editor
– a web browser, a mail sender, and FTP and Telnet clients
– disk wipe tools
– partition tools
– an imaging tool
– a hex editor to read data directly from the disk (I don’t know if I’ll ever use that, but hey… a rough sketch of the idea follows below)
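
Side note for the curious: that last item isn’t magic.  Here’s a minimal Python sketch (my own illustration, not anything from Active@’s product) of what a raw-disk reader does at its core: open the disk device and dump its bytes.  It assumes Windows’s \\.\PhysicalDrive0 path for the first physical disk and needs to run from an elevated prompt.

    # Minimal sketch of a raw-disk hex dump -- an illustration only.
    # \\.\PhysicalDrive0 is the conventional Windows path for the first
    # physical disk; opening it requires administrator rights.
    import sys

    DEVICE = r"\\.\PhysicalDrive0"  # assumption: first physical disk
    SECTOR = 512                    # classic sector size

    def hexdump(data: bytes, base: int = 0) -> None:
        """Print bytes in the usual offset / hex / ASCII layout."""
        for i in range(0, len(data), 16):
            chunk = data[i:i + 16]
            hexpart = " ".join(f"{b:02x}" for b in chunk)
            text = "".join(chr(b) if 32 <= b < 127 else "." for b in chunk)
            print(f"{base + i:08x}  {hexpart:<47}  {text}")

    if __name__ == "__main__":
        try:
            with open(DEVICE, "rb") as disk:
                hexdump(disk.read(SECTOR))  # dump the boot sector
        except OSError as e:
            sys.exit(f"Could not read {DEVICE}: {e}")

Run it and you get the boot sector in hex – the same bytes a recovery tool starts from.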

So you get the idea – it’s got a lot to offer.  If you need a preinstall environment, especially for file recovery, check it out.

The only thing I wish it had: the ability to integrate malware scanners into the PE.  Oh well, you can’t have everything.  And at $80 US, it’s a great value.

The future of IT is Big

October 13, 2009

The New York Times is running an interesting piece about the ever-growing glut of data. The article details IBM’s and Google’s concern over the data glut and asks whether up-and-coming students are being trained to handle the explosion of data. It is quite a fascinating piece.

At the heart of the concern is data. Researchers and workers in fields as diverse as biotechnology, astronomy and computer science will soon find themselves overwhelmed with information. Better telescopes and genome sequencers are as much to blame for this data glut as faster computers and bigger hard drives.

Please click through and read the whole article. It is very good and very true. This topic should be at the forefront for anyone who works in the computer/technology field. First there is the problem of how to store this much data. Currently I work for a small publisher (O’Reilly Media). It is easy to think that a small publisher probably doesn’t have huge storage needs. But since I started working here (one full year, going on my second), we have already ordered our second storage shelf, this time at almost 14TB. The new shelf has yet to be installed, but the other day my IT coworker was talking to management in a meeting. Our last shelf was around 1TB but lasted less than a year. He said that at almost 14TB this one should last us a long time, but then added, “But we say this every time.” It is so true, especially with storage so cheap and drives so big. It reminds me of my first computer in the mid-90s with 10GB of storage. I told my parents I’d never need a bigger hard drive. Then I went away for my freshman year of college and filled it right up with stupid pictures and movie files.
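
Just to put rough numbers on “we say this every time,” here is a back-of-envelope sketch in Python. The growth rate is purely my assumption, not a figure from the article or from our actual usage:

    # Back-of-envelope: how long does a 14TB shelf last if storage
    # demand keeps doubling? The doubling rate is an assumption.
    capacity_tb = 14.0
    used_tb = 1.0        # assume we start where the old 1TB shelf left off
    years = 0
    while used_tb < capacity_tb:
        used_tb *= 2.0   # assumed: demand doubles each year
        years += 1
    print(f"Shelf full after roughly {years} years")  # -> 4 years

Four years. “A long time” only if you squint.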

When I worked for the University of Illinois Engineering department, the problems were worse. One research group I worked for had one professor and maybe five students (including undergrads). They were relatively new, so there was no infrastructure or file server, and there really wasn’t much money for it anyway. One day I went to the professor’s office. He must have had at least 30 hard drives, each at least 500GB if not 1TB. And those were just the drives he had; his students carried around a handful themselves. Another research group, with decades of history, started a scanning project. They would scan hundreds of slides at a time, each producing around 1MB of data. We installed a file array starting at 4TB but expandable to 14. Unfortunately I left and am not sure what they have or need now. My point is that data storage is a huge problem, and it is growing extremely fast. The article mentioned Facebook’s petabyte of photos (I’m guilty of quite a few of those), but that is just one company; many more could have been mentioned. Finally, there is even personal storage: since I got my new camera, I have been looking at NAS boxes for home. So I see the basic point firsthand. The future of IT is data and what to do with it.

Computer scientists and, for that matter, all scientists need to pay special attention. Not only do we need a way to store this much data; more importantly, we need to do something with it. A lot of this will rest on programmers, but it isn’t limited to them. When I worked at the U of I, the students worked on a cluster I built for them, coding in C and tweaking their algorithms to save every last processor cycle. These students weren’t in Computer Science. This summer I took a course at Boston University, and one of my classmates was clearly not a computer person. I asked her why she took the course: she was a statistician heading to grad school for statistics, and the school asked her to take programming courses so she could analyze data sets. And of course there are the computer scientists themselves, whose future depends on analyzing such data.

The future is big data, and lots of it. It is no longer just Google and IBM storing and analyzing it; now even the smallest research group or a little publisher can generate mounds of information. Time to start paying very close attention.

My change of heart

September 30, 2009

“Ten Reasons Why Windows XP Will Be Around For A While” – I saw it this morning: http://blogs.techrepublic.com.com/10things/?p=1045

It’s a good list, and a point of view I would have religiously followed until very recently.  So I’m coming out – I’m not so sure about keeping XP anymore.

There, I said it. =) My reasoning before was something like this: “It works fine, and the replacement is junk, so why upgrade?”  But I think that with W7, there will not be the same reasons to keep XP around.  Here are a few factors that contribute to my thesis.

1) Hardware standards have come a long way since 2002.  A small example?  DX10.  Yes, I know if you’re not a gamer, you might not care, but it’s just an example.  I personally think it’s time to up the ante on the OS and make fuller use of the newer hardware available.

2) That new hardware?  It’s getting cheaper.  One of the big complaints about Vista was that people had to spend a lot on new hardware to run it.  In the end, though, that might have been good for the computing world, because it pushed people to get rid of the dinosaur in the basement.  Now you can get an amazing desktop system for $500, or a notebook for $650.  The hardware isn’t that expensive at this point.

3) Windows 7.  It looks like it’s going to be to Vista what 2K/XP was to ME.  On top of that, one of the very points in today’s TR article was that XP will stay around because W7 includes a virtual version of it.  Huh?  I can see where he’s pointing (that companies will stick with XP software because of the virtual option), but I think it will go the other direction.  Yes, we’ll keep some (not all) of our XP software, but I think companies will eventually ditch standalone XP in favor of W7’s virtual option.  From an IT standpoint, the end of XP is obviously near, and MS will either have to revamp their entire compatibility strategy (way outside the box) or we’ll have to figure out how to make newer software.  For now, though, I think virtual XP will facilitate a move to 7 rather than hinder one.

I will say this.  Microsoft hit a gold mine with XP.  They should take it as a compliment that they themselves haven’t been able to top it yet.  But they will need to eventually if they want to keep market share.

A lesson in IT planning

September 18, 2009

The cliché goes: “Fail to plan, and you plan to fail.”  I’ve recently had a unique opportunity to see both good and not-so-good planning at work in my role as a technology consultant.

A bit of background if you haven’t read my other posts.  I work day shift as an LII desktop tech for a medium-sized corporation.  In my spare time I consult for local small businesses, helping them plan, implement, and maintain their IT systems.  I have a relatively selective client base that mainly divides into 1) malware and performance issues for individuals, and 2) small-business IT services for several local firms.

Two of my major clients are realty offices, each employing several sales professionals as well as ancillary staff (real estate lawyer, receptionist, etc.).  In fact, they are direct competitors in our small-town real estate market, and I rather enjoy seeing them work hard to ‘outdo’ each other.  In the course of their battle for market share, both decided that a new location would serve them better.

Client 1 has a wireless/wired network with some 20 data jacks throughout the small office (as many as 4 in one 12×16 office!).  They use a LAN-enabled printer and have a special-use computer acting as a file server.

Client 2 has a wireless-only network – the exceptions being a printer and the receptionist’s PC (physically located at the router).  Everything else is wireless because they primarily use laptops.  They share files on the receptionist’s PC.

Oh, the joys of zero-config wireless.  I spent a mere hour onsite setting up Client 2.  Of course, I had worked on their systems before and made sure that their file/print sharing was working cleanly.  I also put a bit more time into that project for the pre-project meeting, for a total of 1.5 hours charged.

Client 1, however, took some three or four hours over the course of a week.  Patch cables had not been ordered (this was my fault) because we had not discussed the network’s physical setup thoroughly enough prior to the move.  In addition, we almost bought a new switch for them because they didn’t know they had a 24-port sitting in the basement of the old building.  We ordered the cables (rather pricey given how many jacks they had installed), and I installed and configured the network.

Here’s my analysis of my planning.  Poor planning on my part caused the wait time for Client 1, but the difference is small when it comes to the final time charged (I might have saved a half hour if I hadn’t made that mistake).  Still, I need to learn from that.  The other factors in this situation mostly involved the owner not being organized about what was wanted beforehand (again, I could have proactively helped with that), and asking me to fix PCs when I was there to help them move (let’s focus on one issue at a time!).  I do enjoy working with that client, but the experience was a heads-up that I need to plan better and be more assertive with that particular client.

Then, a few months later, I moved Client 2.  He knew exactly what he wanted, and that was such a big plus.  He worked with the telco to make sure the needed jacks were hot prior to my arrival.  Our short planning meeting was to the point and productive.  When I got there the day of the move, there was little extraneous work to be done other than getting the receptionist’s desk put together so I could put the monitor on it.  Everything went together without a hitch: set up the computer, test printing and the network, then test the laptops on the network.  There was one small additional piece of work – setting up network printing for a user who had not previously used the workgroup printer.  But that fit within the scope of the project, since network printing was a big part of testing the new setup.

I will point out that I am not entirely sure about Client 2’s decision (made independently of me) to have only two network jacks in his entire suite (a large main room plus two side offices).  The building’s physical setup is such that installing new jacks would not be difficult, but I would have put two jacks in each room anyhow.  We’ll see as time goes on – he’s certainly future-minded in using a primarily wireless network.

My takeaway?  As a consultant, make sure I know not only the network’s logical topology, but the physical topology as well.  Plan the planning meetings ahead of time so that I know what I’m looking for.  Make sure to stick to the scope of the project at hand, and defer noncritical tech support requests until after the project itself is finished.  And educate my clients where appropriate so that they can more fully communicate their needs.  We’re here for them, after all.

Microsoft to give preview of Office Web

September 18, 2009

I am curious to see what good (and bad) things the Web App version of Office will hold.  Obviously, not all features will be included.

I’m not sure what to think.  I feel like Microsoft products are thoroughly ingrained in the business world, and we’re not moving away from them anytime soon.  This is OK by me.  For all their problems, they tend to be predictable and generally reliable.

Check out a bit more of a rundown on the upcoming testing here.

In the above link, one commenter suggested that Office Web would be useful if you had to show a presentation but didn’t have Office (and did have Internet access).  I might point out that MS has already covered that base by providing free viewer software for Office.  I think they’re really trying to move production work with Office into a browser.  Not just viewing – they want people to work in the browser.  We’ll see how it catches on.

“Flash cookies” are the new privacy offenders

September 8, 2009

Ever heard of an LSO?  A Local Shared Object is similar in many ways to a typical HTTP cookie, but it’s created and stored by Adobe’s Flash Player instead of your browser.

In case you’re not up on the subject, a cookie is a small text file (4KB at most) that is stored on your computer.  When used by ethical developers, it’s a fairly innocuous way to make your browsing experience more convenient.  Cookies are responsible for remembering your Gmail login, the address that auto-fills on the electric company’s website, etc.  They’re a useful way to keep information around in a relatively secure manner.
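
To demystify that a bit: a cookie is really just a name/value pair the server sends in a header and your browser echoes back on later requests.  Here’s a tiny illustration using Python’s standard library (the name and value are made up):

    # What a cookie looks like on the wire: a Set-Cookie header.
    # The session name/value below are hypothetical.
    from http.cookies import SimpleCookie

    jar = SimpleCookie()
    jar["session_id"] = "abc123"                  # made-up value
    jar["session_id"]["domain"] = ".example.com"  # who may read it back
    jar["session_id"]["httponly"] = True          # hide it from page scripts

    # Prints something like:
    # Set-Cookie: session_id=abc123; Domain=.example.com; HttpOnly
    print(jar.output())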

There are some significant privacy concerns with cookies, though, as marketers quickly found a way to abuse them.  Enter third-party cookies.  But even with those concerns, you can set your browser to reject third-party cookies.  Or all cookies, for that matter.

With LSOs, however, many users don’t even know they exist.  And unlike your vanilla 4KB cookie, LSOs can store 100KB of information by default.  That doesn’t sound like much, but in plain text, that’s a whole lot of information about your browsing habits.  Like HTTP cookies, LSOs are domain-specific (that is, an LSO can only be read by machines on the domain that created it).

So the big concern with LSOs is this: many users think their privacy is secure when they turn off cookies.  It’s not, because LSOs act like cookies but are not controlled by your browser – they’re controlled by Adobe’s software.

LSOs are turned on by default.  You can find information on managing (read: turning off) LSOs on Adobe’s website here.
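
If you’re curious whether Flash has been quietly accumulating LSOs on your machine, here’s a quick Python sketch that lists the .sol files.  The two directories below are the usual Flash Player defaults on Windows and Linux – I haven’t verified every platform, so adjust for your setup:

    # List Flash LSO (.sol) files and their sizes.
    # The directories are the conventional defaults, not guaranteed.
    import os
    from pathlib import Path

    candidates = [
        Path(os.environ.get("APPDATA", "")) / "Macromedia" / "Flash Player" / "#SharedObjects",
        Path.home() / ".macromedia" / "Flash_Player" / "#SharedObjects",
    ]

    for base in candidates:
        if base.is_dir():
            for sol in base.rglob("*.sol"):
                print(f"{sol.stat().st_size / 1024:6.1f} KB  {sol}")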

Are LSOs a concern to you?  Why or why not?