Tuesday, December 1, 2009

Ubuntu Touchpad Disable

Just as a note, I've also been having problems with my touchpad since installing Ubuntu 9.10.

With 9.04 I was able to get my disable-touchpad hotkey working with minimal configuration (I think I had to install gsynaptics), but in this version of Ubuntu it is not nearly as simple.

Apparently this is down to the change of hardware manager and the lack of a global variable within Ubuntu. Several people have come up with fixes, usually involving some sort of shell script, but I haven't got around to actually implementing one. It would be good to see a fix actually committed to Karmic for this, but personally I don't have the experience with Ubuntu packaging and operating system structure to even know where to start with developing a solution.
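For what it's worth, here is a minimal sketch of the kind of toggle script people describe, written in Python rather than shell (since that's what I'm using for my project anyway). It assumes the synaptics driver's synclient tool is installed and able to talk to the driver on your setup, so treat it as a starting point, not a tested fix:

    #!/usr/bin/env python
    # Toggle the touchpad by flipping the synaptics driver's TouchpadOff
    # parameter via synclient (assumes synclient is installed and working).
    import subprocess

    def touchpad_is_off():
        # 'synclient -l' prints one "Name = value" line per driver parameter.
        listing = subprocess.Popen(["synclient", "-l"],
                                   stdout=subprocess.PIPE).communicate()[0]
        for line in listing.splitlines():
            if "TouchpadOff" in line:
                return line.split("=")[1].strip() == "1"
        return False

    # TouchpadOff=1 disables the touchpad; TouchpadOff=0 re-enables it.
    new_value = "0" if touchpad_is_off() else "1"
    subprocess.call(["synclient", "TouchpadOff=%s" % new_value])

Bound to the hotkey via the desktop's keyboard shortcut settings, something like this should behave roughly like the old gsynaptics toggle.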

Might be a good idea for a future project.

Problems with Ubuntu + Skype

I recently upgraded to Ubuntu 9.10 Karmic Koala from 9.04 Jaunty Jackalope (a blog post in itself), and reinstalled all of my applications from scratch. As part of this process I had to download and install Skype for Linux.

This has always been a sore point for me as far as Linux is concerned. The version of Skype I used previously was deemed the "latest stable" version, but I constantly had problems. One example was multiple conversation windows being spawned for a single conversation; I would say "hi" to someone, and their response would somehow be directed to another Skype window, which I would also have to open. Another common occurrence was false status updates; a friend would come online, but Skype would still show them as offline until I sent them a message, at which point their status finally updated.

This was all for a "stable" version.

About this time I found out a new beta version was being tested. I installed it, and then uninstalled it about 10 minutes later. The program that had been delivered to me was completely unacceptable by open source beta standards, let alone commercial standards. There were obvious glitches with text not updating, and many of the problems I had with the previous version still remained.

Most recently, when I updated to version 2.1.0.47, I found that overall Skype for Linux has improved slightly. Very slightly.

This version eliminates many of the bugs found in previous versions, although development seems to move at the speed of a river of bricks; my guess is there is one poor Linux developer slaving away behind the scenes to iron out the many wrinkles and creases in the current application, while trying to keep the features up to date with Skype for Windows.

The one major bug I can find is that since installing this new version of Skype under Ubuntu 9.10, every now and then my input devices are paralysed: I can only type text into certain windows, and with limited success. Sometimes alt-tab works, sometimes not. About the only way I've found of seemingly solving the problem is switching between the ctrl-alt-f1 terminal and the GNOME desktop repeatedly until I regain control over the desktop. Even if the restart-GDM shortcut were still active in Ubuntu, I wouldn't want to be restarting it every 10 minutes.

I'm not even entirely sure this is a problem with Skype, but I suspect it is, both because of the history I have with Skype and because focusing the Skype chat window usually seems to set off this bug.

I would be interested to know if anyone else has had similar experiences with Ubuntu 9.10 and Skype, and whether anyone knows a more permanent fix.

Wednesday, October 21, 2009

The PC: changing roles

This isn't a new observation, nor am I the first to point it out. But looking at the technology industry recently, it becomes obvious that the role of the PC in people's lives has changed markedly over the last few years.

First I will mention PC gaming, a topic very dear to my heart. It is my opinion that the demise of the PC as the standard gaming platform was partially a result of console hardware coming in well below the cost of a gaming PC. As a student living off next to nothing, I have to make the most of my money. With a gaming PC I was forced to upgrade something every 6 months in order to keep up with game requirements, and then every 2-3 years came the obligatory rebuild as new technology rolled in the door. But it was somehow worth it. Why? Because PC display technology was far ahead of console display technology. Previous consoles ran on very low-resolution PAL or NTSC displays and generally used a one-chip-does-it-all approach. Comparatively, computers at the time were already running resolutions of 1024x768 or higher (much higher for an exclusive elite in the gaming community) with dedicated graphics processors. Modern consoles run resolutions as high as most HDTVs or computer monitors will allow, with little or no lag. And from a development perspective, consoles are unified, standard platforms, reducing the amount of time spent testing different hardware configurations and fixing the resulting bugs.

As computers have hit the speed barrier, the technology in hand-held devices has continued to surge onward. Cell phones can now act as mobile computers, with internet connectivity and fairly modest processing power. Netbooks have also helped bridge a gap in mobile computing, and MID devices look set to soon do the same. The mobile phone and computer paradigms are merging into one central device which is both a computer and a communications platform.

In the business space, Google has transformed the internet with its server farm designs. Suddenly large-scale computing power is achieved with large networks of PCs, where previously only a mainframe system would have been suitable or viable. Supercomputers are also being affected, with many new systems being built from standard PC CPUs and GPUs.

The current period of computing spells out an interesting crossroads for the PC and its role in computing; the number of computing devices available to us is increasing, each with shifting roles in our everyday lives. At the same time, these devices are becoming far more convenient and cost-effective than the PC. What the future holds, no one knows.

Saturday, September 12, 2009

Iterating through multiple objects at once in Python

As part of my project coding some image processing with OpenCV/Python, I needed to iterate through two images at once (the images have identical dimensions). Please note that for this I'm using the OpenCV 1.1pre1 build and Python 2.6. After a quick Google search I found this page, which details the various methods for iterating through multiple objects at once in Python.

I used the "zipping" method; this basically takes both iterable objects and joins one sequential item from each to form a tuple. These tuples are collected into a list containing all the object pairs. The zip stops at the end of the shortest iterable object (so the final list will be as long as the shortest object).
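To illustrate, here's a minimal sketch with two plain lists standing in for the two images' pixel sequences (the names are made up for the example; any pair of iterables works the same way):

    # Two iterables standing in for corresponding pixel sequences from
    # the two images.
    pixels_a = [10, 20, 30, 40]
    pixels_b = [3, 5, 7]

    # zip() builds a list of tuples, truncating to the shorter input:
    # zip(pixels_a, pixels_b) == [(10, 3), (20, 5), (30, 7)]
    for a, b in zip(pixels_a, pixels_b):
        print(a - b)  # process each pair together, e.g. a difference image

Since my two images have identical dimensions, the truncate-to-shortest behaviour never actually comes into play.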

Monday, July 13, 2009

The Journey Continues...

...into the realm of C/C++.

I have been absent for a while (again) while I try to get uni under control. My second-to-last semester seems to have blown by in a great hiss and roar, along with the break between semesters (which I can't really say was much of a "break" for me). Oh well, so uni goes. Only one more semester to go.

Recently I've been getting up to my neck in various image processing projects, and most recently I decided to attempt using OpenCV for my latest assignment, since I was going to use it for my project anyway. Yes, I would rather code in C/C++ than Matlab. Not to mention all the tools I am using are free. The one bit of proprietary software I'm using is Microsoft Visual Studio 2008 Professional edition, not for the .NET libraries and classes, but simply because it's easier to import OpenCV and cvBlobsLib into (because these libraries are written in Visual C++). Oh, and also because the company I am doing my project for works in Visual Studio. And technically this is still a freebie, because I received a copy courtesy of the MSDNAA (Academic Alliance) program. Otherwise I would probably just work in Eclipse, even if I had to manually build the libraries from source.

OpenCV itself will appear kind of hideous to someone like myself who hasn't had much experience with non-object-oriented programming. It's easy enough to use, though, and makes the whole process as painless as possible. OpenCV includes libraries for just about everything imaginable: from basic array operations, to filtering and segmentation, to object recognition and motion tracking. It really is the most versatile library you will find for image processing. It is not complete, though: as it stands, OpenCV lacks blob labelling and feature measurement/generation.

Enter cvBlobsLib, a complement to the base OpenCV library which basically adds all of the above. It even has Matlab-like profiles for blobs, including features such as the convex hull, best-fit ellipse, maximum pixel dimensions, area, perimeter, etc.

The final library I will mention is actually a set of libraries I stumbled upon while looking for a way to iterate through the files in a directory (the C/C++ standard libraries currently have no directory/file lookup functionality). Boost is a free library collection which adds a lot of much-needed functionality to C++. Not only does it do this for free, but it is portable across most (if not all) operating systems thanks to its cross-platform design. In layman's terms, you can use this set of libraries on Windows, Linux, MacOS, and even some other less well-known operating systems without any major issues.

Now, hopefully, with this combination of free libraries tacked together, I will be able to go full speed ahead with both my project and my assignment.

Thursday, June 11, 2009

Windows 7 RC

Halfway through the month and I finally managed to squeeze in time for a blog post...

For the last couple of days I've been playing around with the Windows 7 RC, so I thought I would share my impressions. First, though, a comparison with the beta version. I'm relieved that this time it actually included basically all the drivers I needed to get going. When I installed the beta, I found that some of my hardware hadn't been installed, including my all-important WiFi drivers. This caused me to pretty much immediately abandon testing (64-bit drivers aren't available for my laptop from the manufacturer :-/).

This time the experience got off to a smooth start. Windows basically installed itself, without requiring me to spend half an hour specifying configuration options (like the classic XP or Ubuntu installs). The only drivers I had to install were the one for my laptop keyboard's special function keys, the Synaptics driver (so my touchpad disables when I plug in my external mouse), and the NVidia video driver. Most of these were the Vista 64 versions.

OK, so the first thing anyone will notice in 7 is the taskbar at the bottom of the screen. It has several fancy features like Aero Peek to augment it, but the taskbar itself has been heavily overhauled. As anyone who's tried Windows 7 will notice, running applications only have icons displayed, but because of the increased size of the taskbar and the quality of most modern icons, this isn't such a problem. Additionally, instead of having a "Quick Launch" bar slapped on there, you can pin applications to the taskbar. There are some additional niceties too: when I was downloading something with Internet Explorer, the taskbar icon pane for IE showed the current download progress. All up, I think it is a definite improvement to the classic Windows desktop, although it should still be familiar enough (especially to Vista users) that it won't scare users off.

On that note, I remember thinking when XP came out that its completely off-the-wall look would drive people batty, and it did for a while. As people were forced to come to terms with it at work and bought new PCs with it, the interface slowly became more acceptable. I think there will be a similar acceptance phase with Windows 7.

Let's talk performance. Windows 7 is much faster than Vista (although I haven't had a chance to try SP2), and tends to chew far fewer resources. This would probably change over time, as with all operating systems, but so far it's snappy, responsive, and beautiful. It still can't match Linux systems for memory performance (although it probably does better on idle CPU usage), but all its Vista-like caching of program startup files does provide an advantage: applications start in a snap. Ubuntu or XP, meanwhile, leave droves of memory unused. That's good if you want to start a hefty app like a VM or a game that needs in excess of 1GB, but now I can recognise why Microsoft chose to make use of that memory rather than let it go to waste. That said, SuperFetch seems a lot less aggressive in this version of Windows compared to Vista. Vista was forever swapping away at my hard drive (probably driving it to an early grave). Windows 7, on the other hand, will do some swapping when applications open and close and memory is fluctuating wildly, but it reaches a steady state: a point where it either just casually ticks away at the hard drive or sits idle.

It makes a change from my Ubuntu desktop. Although 7 is shiny, at the moment I'm still tied to Ubuntu because of applications and research I have saved on there, and I don't want to get tied to this partition because it is only a time-limited RC (although I format my computer every ~6 months anyway). I highly recommend Windows 7 to any current Windows user, whether you're using XP or Vista. On that note, though, I still recommend you have 2GB of RAM; I don't think the requirements for Windows 7 are that much lower than Vista's, whatever Microsoft claims. On my laptop with 2GB of RAM, 7 runs very comfortably, with some room to stretch its legs.

Wednesday, May 20, 2009

How to enable 3D acceleration in VirtualBox

Well, today I was just messing around with my VM, working on an assignment, and I finally discovered what I had missed when trying to activate 3D acceleration.

At the time I assumed it was an automatic feature, and that as long as you had the latest version of VirtualBox and the guest additions it would just work. This turned out to be false. I found a check-box hiding in the virtual machine's settings which enables 3D acceleration; ticking it eliminated the messages I was getting about the video adapter not having any 3D acceleration. Here's a screenie of the offending checkbox:
Check this little beauty, have the VirtualBox guest additions installed, and you're good to go. Note that VBox's Mouse Integration (MI) feature seems to cause the mouse to play up in games, so I found turning MI off during games was the most effective way of solving the problem.

The performance is pretty mediocre, as you would expect, but everything works for me (apart from the mouse bug). Here's a screenshot of one of my all-time favourite games (Freelancer) running in VBox to prove it:
The rest of the process I hope I explained well enough in my previous post, but if not, just drop me a comment.

Wednesday, May 13, 2009

Finally got my touchpad working

One of the things that has definitely been bugging me over the last few months of using Ubuntu is the non-operation of my touchpad enable/disable button. I didn't know why it wasn't working (I just assumed it would, as you do with Linux), and I couldn't find any easy solution to make it work. Until today.

I was searching for information on SHMConfig (the key to advanced touchpad management) today and managed to stumble across a very good help article on the Ubuntu website, dealing with enabling SHMConfig the safe and easy way. Lo and behold, once I had done this step and rebooted, my laptop's touchpad enable/disable key works like a charm. Success!

Tuesday, May 12, 2009

VirtualBox and Windows Direct3D/DirectX

As I mentioned in a previous post, I've been quite a keen user of Ubuntu Linux for the last few months, and as part of my experience I've been running an XP VM on my Ubuntu install to supplement WINE and make up for the areas it lacks in (such as XP/Windows development tools). For my VM I've been using VirtualBox OSE (Open Source Edition), which provided me with all of the functionality I needed, and was handily available in the Ubuntu repositories.

The downside to VirtualBox initially was its lack of 3D acceleration support; originally it didn't allow any, and one release after I started using it they added OpenGL support for Linux. Well, I just checked again after the most recent update, and it seems Direct3D (DirectX) support has indeed finally been added (yay, games).

Note that at this stage DirectX 10 support is still marked as experimental, so if you were going to try and run a Vista or Windows 7 install with DirectX 10, you might still have a little longer to wait. I tried getting the DirectX support working, everything from installing the latest guest additions to updating my VM, but I kept getting a message saying that the graphics device does not support Direct3D. For the moment, I give up. Maybe I'll have more luck with future versions down the track.

Friday, May 8, 2009

Smells like change

Just to top it off for today, I stumbled across another article speculating that the industry is in something of a transitional phase.

"Computing, on about a 10-15 year cycle completely reinvents itself," said Dr Andrew Herbert, Managing Director of Microsoft Research Cambridge in a keynote speech today attended by TechRadar. "Obviously the PC itself was one of those waves, the internet and web another. Those things are crucially dependent on software...that's the bread and butter [of what we do here]."

Again, surprising that it's coming from Microsoft. They obviously know something we don't?

I must have been onto something....

Just a quick update.

In my last post I talked about how software must at some stage converge to a point where improvement can no longer be made. This point usually occurs around the time of a major paradigm shift in the associated software.

I was very interested to note an article on CNET News reporting that Steve Ballmer had apparently outlined something similar in his recent speech at Stanford University in the States: "The problem with software wasn't that people didn't want computers, though. The challenge, Ballmer said later in his chat, is that software doesn't wear out, meaning companies have to do something new and different to get people to upgrade."

The way the industry is going at the moment, there are several things that could erupt into potential paradigm shifts. For one, the netbook market is still a race everyone's trying to enter. Other technologies, such as touchscreens, z-cams, and the constant evolution of computing hardware (mainly in its growing complexity), all have the capability to be game-changing. Or the current paradigm could hold, the bubbles around some or all of these technologies could burst, and we could continue along under the current status quo for a few more years.

The same is also happening in the auto industry; the speed at which electric and hybrid vehicles are becoming feasible is dizzying. But with several reports claiming the infrastructure is not ready to support them, and with no real consumer buy-in yet, no one knows what will happen when these vehicles hit the market.

Wednesday, May 6, 2009

Wow, it's been a while.

Hard to think it's been almost a year since I set foot in here... *tumbleweed*

I've been hard at work, though; summer saw me working on the touchscreen software for the Dell Studio One 19, as well as the touchscreen tables for NZ@CeBIT 09. Quite an exciting summer, considering it all happened in tiny little Palmerston North.

Since it's been so long since I was last here, I thought I would take the chance to reflect on the last year and what has happened in that time (and see where I go from there).

About this time last year, I was speculating about the property market, and the fact that over-inflated prices would imminently come down, as we could all afford to wait for it to happen. I didn't even know then how right I was, but at the same time I never could have predicted the fallout from the various financial collapses all over the world, largely due to the short-sightedness and profit focus of banks and lenders.

I am well into my fourth (and last!) year of study at Massey University now, thankfully. Only a few hard months, and I will be set loose upon the world (look out, world). In the meantime my free time consists of 10-minute breaks between classes, or periods where I just sit in my chair, dazed by the sledgehammer-like density and suddenness of the work this semester. I'm currently sitting on top of 6 assignments. But of course, if it were easy, everyone would have a BE majoring in Electronics and Computer Systems Engineering.

One of the significant changes over the last few months has been my adoption of Linux on my laptop, and of open source software in general. As a sort of personal experiment, I installed Ubuntu 8.10 to see how Ubuntu was doing. Since then I have upgraded to the new and shiny Ubuntu 9.04 Jaunty Jackalope. Linux really is a different animal to Microsoft Windows. I definitely wouldn't bestow it with the label of "easier to use" just yet. However, where it excels is its extremely small resource footprint, and its non-existent cost footprint. By small footprint I mean it consumes about 600MB of my 2GB of RAM when running OpenOffice, Firefox, and a few other apps (such as Skype and Pidgin). This makes it feel a lot more responsive, because it's not constantly swapping off the hard drive like Vista (which consumes the entirety of my memory, plus 1.3GB of swapfile). Once it's up and running, it's just as easy to use as Windows, and sports a wide range of completely free software, usually from Sun, GNU, or other open-source supporters. But you pay for this in the initial setup (which is getting better), and in the immaturity of some of the open source software compared to commercial equivalents.

Windows has been kind of stagnant since Vista, I guess because it was such a disappointment. Fortunately, this should change (at least a little) with the release of Windows 7 later this year. I guess what will really affect it is whether Microsoft manages to make the transition from XP smooth enough, particularly on older computers and slower hardware. Vista users are practically queueing up to get hold of 7, so I don't think there's any doubt that it will score that part of the market.

One thing I have been thinking about, mostly regarding software in general: is there a saturation point for the complexity of conventional computer software? Is there a point where, if the paradigm doesn't shift in any major way, software fully satisfies the requirements people have of it? Some will argue that software is still making huge advances, and that plenty of new applications are coming out, but I point mainly to older software. As examples, I tend to look mostly at some of the oldest pieces of software for this "age": desktop operating systems and office suites. Looking at Windows, I think the software really met all (or almost all) user needs with XP. After switching to it and using it for a few years, people really got comfortable with it. This created the problem Microsoft now faces: trying to wrench it from the hands of an experienced and devoted following. Since XP, the upgrades to the system have felt more like separate add-ons than contributions to the actual operating system core. This could even go back to 2000, the first mainstream MS Windows to run on the NT kernel. Likewise with office programs: it feels like Microsoft's attempts to improve the usability and "look-n-feel" of Office are attempts to cover for a lack of groundbreaking new features.

This compares to years ago, when a new operating system meant huge features like the USB interface, 3D gaming, and the NTFS file system. These additions and technological benefits now seem to be mostly shifting towards specialised devices, which can be better controlled, policed, and protected (such as Apple's much-coveted iPhone). I think this is also, in part, due to the shrinking of the PC gaming community. In its heyday, one could spend huge sums of money on yearly upgrades to keep the gaming rig up to spec, and there was no option that could match the experience. With Xbox Live and the PSN, this has changed completely. Game developers have also moved to consoles, due to better development tools, a smaller testing pool (one set of pre-configured hardware requirements), and better anti-copy protection thanks to the tightly regulated platform. The thing that is really killing PC gaming is that this hardware comes at a pittance compared to PC gaming hardware. A PC would require at least $400-600 a year in upgrades: adding memory, replacing components. And then there's the bi-yearly or tri-yearly building of a new gaming rig, after the old one has been depreciated into the ground.

One of the new features upcoming Windows 7 is boasting is improved touchscreen support; in particular, native multitouch support. This is an interesting addition to the operating system, and might lead to some interesting development for the touchscreen all-in-ones out there, but it could also fail to be of use. My view of touchscreens up until this point has been idealistic: that touchscreens will change the world. But now the technology sits at the precipice, and only time can tell whether it will prove to change the way we use computers. One major factor in the dilution of my excitement over recent developments in touchscreens is the upcoming z-depth camera sensors, which claim to deliver the same experience as a touchscreen, but remotely, and with a far greater range of input options outside of the 2D plane.

I've sort of digressed a bit; my point was originally the stagnation of computer software as it reaches what I would call its maturity phase in the market (as marketing/business people may be familiar with). But in exploring this matter, I suppose I have identified that the slowdown in computers, particularly in the high-end gaming segment, is largely due to the migration of gamers from PC to console. The exception to this is World of Warcraft (it will eat your soul!!!). The point of this exploration is to come to the conclusion that if software really does enter maturity and decline phases as with other products, doesn't it make sense to make software in such a phase open source, and relegate its maintenance and management to its community of users? I know of many cases, particularly in gaming, where software isn't even sold on the shelves any more, but for some reason the developers continue to hold it as intellectual property. This is where the open source software movement has caught up to Microsoft in some respects; it moves much slower, avoiding the minefields that Microsoft tends to wade through. But if software does come to maturity, then eventually the slow-moving open source software should catch up to the commercial alternatives?

This is, of course, all theory. Software still seems to be making reasonable advances, and I haven't yet heard of a market where open and free software has overtaken proprietary equivalents. There are some cases where open source (but not free) software has been successful as a business strategy, but truly free software still advances onwards at the speed of a treacle pudding.

Ideas have been thin on the ground for me over the last few months, mainly due to my complete and utter lack of time to devote to them. I did manage to come up with some interesting ideas for projects during the summer holidays, but even then I didn't get anywhere with them. Hopefully, once uni is over and I'm working full time, I will be able to find some time for personal projects, maybe even a preview of Galaxy (after some more development actually gets done).

I'll try and make my next update a little more prompt.