Wednesday, May 20, 2009

How to enable 3D acceleration in VirtualBox

Well, today I was just messing around with my VM, working on an assignment, and I finally discovered what I had missed when trying to activate 3D acceleration.

At the time I assumed it was an automatic feature: as long as you had the latest version of VirtualBox and the guest additions installed, it would just work. This turned out to be false. I found a check-box hiding in the virtual machine's settings that enables 3D acceleration and eliminated the messages I was getting about the video adapter not having any 3D acceleration. Here's a screenie of the offending checkbox:
Check this little beauty, have the VirtualBox guest additions installed, and you're good to go. Note that VBox's Mouse Integration (MI) feature seems to cause the mouse to play up in games, so I found turning MI off during games was the most effective way of solving the problem.
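If you'd rather not click through the settings dialog, the same checkbox can be toggled from the command line with VBoxManage. A minimal sketch, assuming a VM named "WinXP" (substitute your own VM's name) and that the VM is powered off:

```shell
# Enable 3D acceleration for the VM (equivalent to ticking the checkbox).
VBoxManage modifyvm "WinXP" --accelerate3d on

# Confirm the setting took effect.
VBoxManage showvminfo "WinXP" | grep -i '3D'
```

Like the checkbox, this only takes effect after the guest additions are installed in the guest.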

The performance is pretty mediocre, as you would expect, but everything works for me (apart from the mouse bug). Here's a screenshot of one of my all-time favourite games, Freelancer, running in VBox to prove it:
The rest of the process I hope I explained well enough in my previous post, but if not, just drop me a comment.

Wednesday, May 13, 2009

Finally got my touchpad working

One of the things that has definitely been bugging me for the last few months of using Ubuntu is the non-operation of my touchpad enable/disable button. I didn't know why it wasn't working (I just assumed it would, as you do with Linux), and I couldn't find any easy solution to make it work. Until today.

I was searching for information on SHMConfig (the key to advanced touchpad management) today and managed to stumble across a very good help article on the Ubuntu website, dealing with enabling SHMConfig the safe and easy way. Lo and behold, once I had done this and rebooted, my laptop's touchpad enable/disable key works like a charm. Success!
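For the curious, the "safe and easy way" from that era boiled down to dropping a HAL policy file onto the system. A sketch of the approach, assuming the HAL-based input setup of Ubuntu 9.04 (the exact file name here is my choice; any `.fdi` name in that directory works):

```shell
# Create a HAL policy file that turns on SHMConfig for the synaptics
# touchpad driver. Takes effect after restarting X (or rebooting).
sudo tee /etc/hal/fdi/policy/shmconfig.fdi > /dev/null <<'EOF'
<?xml version="1.0" encoding="UTF-8"?>
<deviceinfo version="0.2">
  <device>
    <match key="input.x11_driver" string="synaptics">
      <merge key="input.x11_options.SHMConfig" type="string">true</merge>
    </match>
  </device>
</deviceinfo>
EOF
```

With SHMConfig enabled, tools like synclient can talk to the driver on the fly, which is what makes the enable/disable key (and other advanced touchpad tweaks) possible.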

Tuesday, May 12, 2009

Virtualbox and Windows Direct3D/DirectX

As I mentioned in a previous post, I've been quite a keen user of Ubuntu Linux for the last few months, and as part of that I've been running an XP VM on my Ubuntu install to supplement WINE and make up for the areas where it falls short (such as XP/Windows development tools). For my VM I've been using VirtualBox OSE (Open Source Edition), which provided me with all of the functionality I needed, and was handily available in the Ubuntu repositories.

The downside to VirtualBox initially was its lack of 3D acceleration support: originally it didn't allow any, and one release after I started using it they added OpenGL support for Linux. Well, I just checked again after the most recent update, and it seems Direct3D (DirectX) support has indeed finally been added (yay, games).

Note that at this stage DirectX 10 support is still marked as experimental, so if you were going to try running a Vista or Windows 7 install with DirectX 10, you might still have a little longer to wait. I tried getting the DirectX support working, everything from installing the latest guest additions to updating my VM, but I kept getting a message saying the graphics device does not support Direct3D. For the moment, I give up. Maybe I'll have more luck with future versions down the track.

Friday, May 8, 2009

Smells like change

Just to top it off for today, I stumbled across another article speculating that the industry is in something of an evolutionary phase.

"Computing, on about a 10-15 year cycle completely reinvents itself," said Dr Andrew Herbert, Managing Director of Microsoft Research Cambridge in a keynote speech today attended by TechRadar. "Obviously the PC itself was one of those waves, the internet and web another. Those things are crucially dependent on software...that's the bread and butter [of what we do here]."

Again, surprising that it's coming from Microsoft. They obviously know something we don't?

I must have been onto something....

Just a quick update.

In my last post I talked about how software must at some stage be converging to a point where no further improvement can be made. This point usually occurs around the time of a major paradigm shift in the associated software.

I was very interested to note an article on CNET News reporting that Steve Ballmer had apparently outlined something similar in a recent speech at Stanford University in the States: "The problem with software wasn't that people didn't want computers, though. The challenge, Ballmer said later in his chat, is that software doesn't wear out, meaning companies have to do something new and different to get people to upgrade."

The way the industry is going at the moment, there are several things that could erupt into potential paradigm shifts. For one, the netbook market is still a race everyone's trying to enter. Other technologies, such as touchscreens, z-cams, and the constant evolution of computing hardware (mainly its growing complexity), all have the capability to be game-changing. Or the current paradigm could hold, the bubbles around some or all of these technologies could burst, and we could continue with the status quo for a few more years.

The same is also happening in the auto industry; the speed at which electric and hybrid vehicles are becoming feasible is dizzying. But with several reports claiming the infrastructure is not ready to support them, and with no real consumer buy-in yet, no one knows what will happen when these vehicles hit the market.

Wednesday, May 6, 2009

Wow it's been a while.

Hard to think it's been almost a year since I set foot in here... *tumbleweed*

I've been hard at work though; summer saw me working on the touchscreen software for the Dell Studio One 19, as well as the touchscreen tables for NZ@Cebit 09. Quite an exciting summer, considering it all happened in tiny little Palmerston North.

Since it's been so long since I was last here, I thought I would take the chance to reflect on the last year and what has happened in that time (and see where I go from there).

About this time last year, I was speculating about the property market, and the fact that over-inflated prices would imminently come down, since we could all afford to wait for it to happen. I didn't even know then how right I was, but at the same time I never could have predicted the fallout from the financial collapses all over the world, largely due to the short-sightedness and profit focus of banks and lenders.

I am well into my fourth (and last!) year of study at Massey University now, thankfully. Only a few hard months, and I will be set loose upon the world (look out, world). In the meantime my free time consists of 10-minute breaks between classes, or periods where I just sit in my chair, dazed by the sledgehammer-like density and suddenness of this semester's workload. I'm currently sitting on top of six assignments. But of course, if it were easy, everyone would have a BE majoring in Electronics and Computer Systems Engineering.

One of the significant changes in the last few months has been my adoption of Linux on my laptop, and open source software in general. As a sort of personal experiment, I installed Ubuntu 8.10 to see how Ubuntu was doing. Since then I have upgraded to the new and shiny Ubuntu 9.04 Jaunty Jackalope. Linux really is a different animal to Microsoft Windows. I definitely wouldn't bestow it with the label of easier to use, just yet. However, where it excels is an extremely small resource footprint and a non-existent cost footprint.

By small footprint I mean it consumes about 600MB of my 2GB of RAM when running OpenOffice, Firefox, and a few other apps (such as Skype and Pidgin). This makes it feel a lot more responsive, because it's not constantly swapping to the hard drive like Vista (which consumes the entirety of my memory, plus 1.3GB of swapfile). Once it's up and running it's just as easy to use as Windows, and sports a wide range of completely free software, usually from Sun, GNU, or other open-source supporters. But you pay for this in the initial setup (which is getting better), and in the immaturity of some of the open source software compared to commercial equivalents.

Windows has been kind of stagnant since Vista, I guess because it was such a disappointment. Fortunately, this should change (at least a little) with the release of Windows 7 later this year. I guess what will really affect it is whether Microsoft manage to make the transition from XP smooth enough, particularly on older computers and slower hardware. Vista users are practically queueing up to get hold of 7, so I don't think there's any doubt that it will score that part of the market.

One thing I have been thinking about, mostly regarding software in general: is there a saturation point for the complexity of conventional computer software? Is there a point where, if the paradigm doesn't shift in any major way, software fully satisfies the requirements people have of it? Some will argue that software is still making huge advances, and that plenty of new applications are coming out, but I point mainly to older software. As examples, I tend to look at some of the oldest pieces of software of this "age": desktop operating systems and office suites. Looking at Windows, I think the software really met all (or almost all) user needs with XP. After switching to it and using it for a few years, people really got comfortable with it. This created the problem Microsoft now face: trying to wrench it from the hands of an experienced and devoted following. Since XP, the upgrades to the system have felt more like separate add-ons than contributions to the actual operating system core. This could even go back to 2000, the first mainstream MS Windows to run on the NT kernel. Likewise with office programs: it feels like Microsoft's attempts to improve usability and the "look-n-feel" of Office are attempts to cover for a lack of groundbreaking new features.

This compares to years ago, when a new operating system meant huge features like the USB interface, 3D gaming, and the NTFS file system. These additions and technological benefits now seem to be shifting mostly towards specialised devices, which can be better controlled, policed, and protected (such as Apple's much-coveted iPhone). I think this is also, in part, due to the shrinking of the PC gaming community. In its heyday, one could spend huge sums of money on yearly upgrades to keep the gaming rig up to spec, and there was no option that could match the experience. With Xbox Live and the PSN, this has changed completely. Game developers have also moved to consoles, due to better development tools, a smaller testing pool (one set of pre-configured hardware), and better anti-copy protection thanks to a tightly regulated platform. The thing that is really killing PC gaming is that this console hardware comes at a pittance compared to PC gaming hardware. A PC would require at least $400-600 a year in upgrades, adding memory and replacing components, and then there's the rebuild of a new gaming rig every two or three years, after the old one has depreciated into the ground.

One of the new features upcoming Windows 7 is boasting is improved touchscreen support; in particular, native multitouch support. This is an interesting addition to the operating system, and might lead to some interesting development for the touchscreen all-in-ones out there, but it could also fail to be of use. My view of touchscreens up until this point has been idealistic: that touchscreens will change the world. But now the technology sits at the precipice, and only time will tell whether it changes the way we use computers. One major factor in the dilution of my excitement over recent touchscreen developments is the upcoming z-depth camera sensors, which claim to deliver the same experience as a touchscreen, but remotely, and with a far greater range of input options outside the 2D plane.

I've digressed a bit; my original point was the stagnation of computer software as it reaches what I would call its maturity phase in the market (as marketing/business people may be familiar with). But in exploring this, I suppose I have identified that the slowdown in computers, particularly at the high-end gaming section, is largely due to the migration of gamers from PC to console. The exception to this is World of Warcraft (it will eat your soul!!!). The point of this exploration is to come to the conclusion that if software really does enter maturity and decline phases like other products, doesn't it make sense to make software in such a phase open source, and delegate its maintenance and management to its community of users? I know of many cases, particularly in gaming, where software isn't even sold on the shelves any more, but for some reason the developers continue to hold it as intellectual property. This is where the open source software movement can catch up to Microsoft in some respects: it moves much slower, but avoids the minefields that Microsoft tend to wade through. And if the software does come to maturity, then eventually shouldn't the slow-moving open source software catch up to commercial alternatives?

This is, of course, all theory. Software still seems to be making reasonable advances, and I haven't yet heard of a market where open and free software has overtaken proprietary equivalents. There are some cases where open source (but not free) software has been successful as a business strategy, but truly free software still advances at the speed of a treacle pudding.

Ideas have been thin on the ground for me these last few months, mainly due to my complete and utter lack of time to devote to them. I did manage to come up with some interesting ideas for projects during the summer holidays, but even then I didn't get anywhere with them. Hopefully once uni is over and I'm working full time, I will be able to find some time for personal projects, maybe even a preview of Galaxy (after some more development actually gets done).

I'll try to make my next update a little more prompt.