How-To: Fix IntelliJ / Android Studio screen display corruption

With my NVidia 980M w/ Optimus graphics card, Android Studio (which is IntelliJ-based) would often get itself into a state where the screen became corrupted, which was incredibly frustrating to work with. Luckily there seems to be a pretty simple solution:

1 – Find your user folder

If you want to make this change on a per-user basis, then you’ll need to create a vmoptions file in the relevant user directory. I’m doing this fix on Windows running 64-bit Android Studio, so I’ve created the file: %USERPROFILE%\.{FOLDER_NAME}\studio64.exe.vmoptions

Depending on your operating system, the filename and locations are:

Windows:
%USERPROFILE%\.{FOLDER_NAME}\studio.exe.vmoptions and/or %USERPROFILE%\.{FOLDER_NAME}\studio64.exe.vmoptions

Mac:
~/Library/Preferences/{FOLDER_NAME}/studio.vmoptions

Linux:
~/.{FOLDER_NAME}/studio.vmoptions and/or ~/.{FOLDER_NAME}/studio64.vmoptions

Where {FOLDER_NAME} is something like AndroidStudio1.5 or similar.

If you want to make the change on a global basis (although the config file itself advises against this), then you can modify the studio.exe.vmoptions configuration file in the bin folder of your Android Studio installation (on Windows this is likely to be under “Program Files” for 64-bit or “Program Files (x86)” for 32-bit Android Studio). So for me the ‘global’ config file is: C:\Program Files\Android\Android Studio\bin\studio64.exe.vmoptions

Source: http://tools.android.com/tech-docs/configuration

2 – Add this switch
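The switch in question disables Java2D’s Direct3D rendering pipeline, which lines up with the Direct3D caveat in the wrap-up note further down. Add the following line to the vmoptions file from step 1 (this is the option given in the Stack Overflow answer linked below – if your symptoms differ, double-check against that source):

```
-Dsun.java2d.d3d=false
```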

Save the file, then restart Android Studio, and the display corruption should be fixed!

Source: http://stackoverflow.com/a/27768329.

Wrap up and secondary workaround

This worked fine for me on Windows, but as the settings affect Direct3D, I’m somewhat doubtful that it’ll work on Linux or Mac. However, I did read about a workaround: in the Nvidia Settings panel, select your Java runtime and force it to always run in maximum performance mode, bypassing any use of Optimus (which may also be causing the screen corruption). So should the above fix not work for you, perhaps the workaround will.

Cheers!

Mirror’s Edge PhysX Cloth Simulation

I had no idea you could enable cloth simulation like this in Mirror’s Edge on the PC – it looks fantastic! Will have to give it a final run-through before Mirror’s Edge 2 comes out…

Note: NVidia GPU required or the PhysX processing will occur on the CPU and very likely clobber your framerate.

Update

This all works wonderfully – until glass gets shot out and the game drops to 1 frame per second. Then you read up on it: set PhysX to run on the GPU or Auto rather than the CPU, update PhysX, rename PhysX DLLs, rename folders to force the game to use the driver’s version of PhysX rather than the bundled one.

I could disable PhysX and it’ll run perfectly – but at this point I’m two hours of debugging in and starting to remember why I don’t play games on PC anymore. Because the PC master race is just a PC, with all its config foibles, glitches and issues. And that’s why I stopped letting it waste my time.

How To: Fix X.org Black Screen with Nvidia drivers and Linux Kernel 3.10+

I’m running Arch Linux with a Nvidia graphics card, and after doing a system update back in March 2014, any kernel past 3.10.x would cause X / Xorg to fail to start; instead there’d be a black screen and the fans on the graphics card would spin up to full speed, so my machine sounded like a hovercraft. I initially got around this by running the linux-lts (Long Term Support) kernel, until that too exhibited the exact same issue and I was forced to boot the system from USB, chroot into it and switch out the nvidia driver for nouveau (the open-source, 2D-accelerated nvidia driver).

However, as I happen to quite like accelerated 3D graphics and occasionally playing games, I’ve been digging around for a fix for this for ages – and it turns out that the fix (which is now added to the Arch Nvidia wiki page) is to add the following kernel parameter to your bootloader’s kernel line:
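The parameter in question, as documented on the Arch Nvidia wiki page and in the devtalk thread linked further down, is:

```
rcutree.rcu_idle_gp_delay=1
```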

If you’re using GRUB, when menu shows up just move the selection to Arch or your Linux distro exhibiting the issue, then hit the e key to edit the line, and change the linux kernel loading line from, for example:
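For illustration, a typical linux line might read something like this (your kernel image name, root device and existing options will differ):

```
linux /boot/vmlinuz-linux root=/dev/blah rw quiet
```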

To include the rcutree kernel parameter, something like this:
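For example (again, your existing options will differ – only the trailing rcutree.rcu_idle_gp_delay=1 parameter is the addition):

```
linux /boot/vmlinuz-linux root=/dev/blah rw quiet rcutree.rcu_idle_gp_delay=1
```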

At which point you can hit F10 (again, assuming you’re using GRUB) to boot using your newly added kernel parameter. Don’t worry if your kernel loading line identifies the partition by UUID value instead of plain ol’ /dev/sdX – that’s fine, mine does too (I changed it to /dev/blah to make the line shorter). Just add the parameter and try your luck – hopefully your distro will now boot into X without issue, and you can stop swearing and cursing like a drunken sailor. Or perhaps that was just me.

Further reading about the issue on the nvidia development website: https://devtalk.nvidia.com/default/topic/567297/linux/linux-3-10-driver-crash/4/.

Cheers!

P.S. If you also need to chroot into your Arch system to change the drivers, then here’s a quick run-down of the process. Make a bootable USB from the Arch ISO (I could only get this to work by using the dd command from another Linux distro – Rufus and UNetBootin made USBs I couldn’t boot from; google or search this site for “dd iso usb” for instructions). Then, when you’ve booted into the live USB/CD-ROM or what-not, run the following commands (the “<---- optional” lines are optional – don’t actually enter that annotation into your files, obviously!):
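A rough sketch of that chroot process follows. The device names (/dev/sda1 and so on) and package names here are examples, not gospel – check your own partition layout with lsblk and the current package names with pacman before running anything:

```shell
# From the booted Arch live USB: mount the installed system
mount /dev/sda1 /mnt        # your root partition
mount /dev/sda2 /mnt/boot   # <---- optional, only if /boot is a separate partition

# Change root into the installed system
arch-chroot /mnt

# Swap the proprietary driver for the open-source nouveau driver
# (package names may vary between releases)
pacman -Rns nvidia nvidia-utils
pacman -S xf86-video-nouveau

# Leave the chroot and clean up
exit
umount -R /mnt
reboot
```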

With any luck you’ll have now transitioned the drivers successfully and have a fully working system – fingers crossed, good luck!

Position Based Fluids: Incredible Real-Time Liquid Simulation

Physics simulations are always interesting, and this one is probably the best liquid sim I’ve ever seen, all running live on a top-end NVidia GPU:

You can read the paper about how it all works here, if you’d like. I don’t think NVidia are gonna hand out the source code, but they’re likely to incorporate it somewhere in the next PhysX release.