Hiding monitor from windows, working with it from my app only
I need to use a monitor as a "private" device for my special application. I want to use it as a flashlight of a sort and draw special patterns on it in full screen. I don't want this monitor to be recognized by the OS (Windows 7) as a usual monitor, i.e. the user should not be able to move the mouse to that monitor, change its resolution, run a screensaver on it, or whatever. But I want to be able to interact with it from my application. The monitor is plugged into a video card (most probably NVIDIA) with an HDMI cable.

What is the simplest way to do this? All solutions are appreciated, including purchasing additional adapters, simple video cards, or any other special devices. The only solution I can imagine for now is to plug the monitor into another computer, run a daemon on that computer, connect it to my computer via Ethernet or whatever, and communicate with that daemon from my computer. It is pretty ugly and requires an additional computer. But I need to solve this problem.

Callboy answered 16/10, 2012 at 17:3 Comment(9)
I wonder if there's a way to get a virtual environment to use it directly?Lorenzalorenzana
You mean use VM as an additional computer from my solution? I don't know, whether it is possible to hide monitor from the OS, but to make it visible for VM.Callboy
You'd probably have to do it the other way round - run the "user" OS inside the VM, and your application outside of it. This way, you should be able to limit the VM OS to just one of the monitors, while you have access to both of them. That does have its price, of course - virtualization still isn't perfect, and if your user has to do something GPU intensive, you're probably screwed.Permissible
@Luaan, thank you. This is an option, but the complexity of this solution is very high - comparable to using another computer with a daemon. Hope to find something easier.Pouter
Find open-source drivers for a graphics card and modify them perhaps, in such a way that it is no longer a graphics card for the OS, but your application knows the API to use it correctly. codeproject.com/Articles/12878/… en.wikipedia.org/wiki/…Dustan
Or use a Raspberry Pi, why not?Dustan
@soulseekah, Raspberry PI is exactly what I was going to do, and it seems the easiest way. As for writing own graphics card driver, I can't even imagine what a huge amount of time this can take.. But I'll study your links, maybe it'll get more clear.Pouter
See Using multiple monitors as independent displays. This article has code to detach monitors.Inquiry
@RaymondChen, that is exactly what I need! Could you post this as an answer so that I could award the bounty?Pouter

To do this, detach the monitor from the desktop. Detaching a monitor from the desktop prevents Windows from using it for normal UI.

Sample code for attaching and detaching monitors is in this KB article. Once you've done that, you can use the monitor as an independent display.
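
For reference, here is a minimal sketch of the detach-then-draw sequence those articles describe. The device name \\.\DISPLAY2 and the 1920x1080 resolution are assumptions; real code should find the name with EnumDisplayDevices and check every return value:

    // Minimal sketch: detach \\.\DISPLAY2 from the desktop, then draw on it
    // through a device context. Enumeration via EnumDisplayDevices, error
    // handling, and re-attaching the monitor are omitted for brevity.
    #include <windows.h>

    int main() {
        const wchar_t* device = L"\\\\.\\DISPLAY2";  // assumed device name

        // Detach: a zero-sized mode with DM_POSITION set removes the monitor
        // from the desktop, so the mouse and normal UI can no longer reach it.
        DEVMODEW dm = {};
        dm.dmSize = sizeof(dm);
        dm.dmFields = DM_POSITION | DM_PELSWIDTH | DM_PELSHEIGHT;
        dm.dmPelsWidth = 0;
        dm.dmPelsHeight = 0;
        ChangeDisplaySettingsExW(device, &dm, NULL,
                                 CDS_UPDATEREGISTRY | CDS_NORESET, NULL);
        ChangeDisplaySettingsExW(NULL, NULL, NULL, 0, NULL);  // apply

        // Independent display: a DC created on the device name still lets
        // the application render its patterns directly on the panel.
        HDC hdc = CreateDCW(device, NULL, NULL, NULL);
        if (hdc != NULL) {
            RECT rc = { 0, 0, 1920, 1080 };  // assumed panel resolution
            FillRect(hdc, &rc, (HBRUSH)GetStockObject(WHITE_BRUSH));
            DeleteDC(hdc);
        }
        return 0;
    }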

Inquiry answered 27/12, 2013 at 12:23 Comment(2)
And we have a winner! :(Nerty
Dear @NothingsImpossible, I'd gladly award a bounty of 50 to your answer, but I can't offer another bounty of less than 200;(Pouter

Building upon your own idea of using an external PC, and Mark's comment on using a VM as this "external" device:

You could buy an external USB-to-VGA video adapter like one of these, approx. USD40:

USB-to-VGA adapter

http://www.newegg.com/USB-Display-Adapters/SubCategory/ID-3046

Almost all VM software supports some kind of USB passthrough; VirtualBox is a great example. Only the VM sees the USB device; the host ignores it completely. So the steps would be:

  1. Buy said USB-to-VGA adapter.
  2. Configure a slim virtual machine and cook up a little utility that receives the images to show on the screen over the network (a minimal sketch follows this list).
  3. Configure VirtualBox to connect the USB-to-VGA adapter directly to the virtual machine.
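
As a rough illustration of step 2, here is a minimal sketch of the utility that would run inside the VM. It assumes the app pushes raw fixed-size 24-bit BGR frames over a single TCP connection; the port, resolution, and frame layout are illustrative choices, not an established protocol:

    // Minimal sketch of the in-VM receiver: accept one TCP connection and
    // blit each complete raw frame (WIDTH x HEIGHT, 24 bpp BGR) full screen.
    #include <winsock2.h>
    #include <windows.h>
    #include <vector>
    #pragma comment(lib, "ws2_32.lib")

    static const int WIDTH = 1024, HEIGHT = 768, PORT = 5900;  // assumptions

    static bool recvAll(SOCKET s, char* buf, int len) {
        while (len > 0) {
            int n = recv(s, buf, len, 0);
            if (n <= 0) return false;  // sender disconnected or error
            buf += n; len -= n;
        }
        return true;
    }

    int main() {
        WSADATA wsa;
        WSAStartup(MAKEWORD(2, 2), &wsa);

        SOCKET listener = socket(AF_INET, SOCK_STREAM, 0);
        sockaddr_in addr = {};
        addr.sin_family = AF_INET;
        addr.sin_port = htons(PORT);
        addr.sin_addr.s_addr = INADDR_ANY;
        bind(listener, (sockaddr*)&addr, sizeof(addr));
        listen(listener, 1);
        SOCKET conn = accept(listener, NULL, NULL);

        // Describe the incoming frames to GDI. 24 bpp rows must be 4-byte
        // aligned, which holds here because WIDTH is a multiple of 4.
        BITMAPINFO bmi = {};
        bmi.bmiHeader.biSize = sizeof(BITMAPINFOHEADER);
        bmi.bmiHeader.biWidth = WIDTH;
        bmi.bmiHeader.biHeight = -HEIGHT;  // negative height = top-down rows
        bmi.bmiHeader.biPlanes = 1;
        bmi.bmiHeader.biBitCount = 24;
        bmi.bmiHeader.biCompression = BI_RGB;

        std::vector<char> frame(WIDTH * HEIGHT * 3);
        HDC screen = GetDC(NULL);  // the VM desktop, i.e. the USB display

        while (recvAll(conn, frame.data(), (int)frame.size())) {
            StretchDIBits(screen, 0, 0, WIDTH, HEIGHT, 0, 0, WIDTH, HEIGHT,
                          frame.data(), &bmi, DIB_RGB_COLORS, SRCCOPY);
        }

        ReleaseDC(NULL, screen);
        closesocket(conn);
        closesocket(listener);
        WSACleanup();
        return 0;
    }

Step 3 can be scripted as well: VirtualBox's VBoxManage usbfilter add command creates a filter that attaches the adapter to the VM automatically whenever it is plugged in.
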
Nerty answered 26/12, 2013 at 8:50 Comment(3)
Haha, nice trick! No need to bother with hardware when you can emulate it. The drawback is that you need to configure and run the VM before starting your application, but on the other hand you don't need to configure ethernet connection to your second computer and don't need to wait for it to boot.Pouter
@Pouter Yeah! Also, VirtualBox has a nice command line interface that allows for Headless, console-only sessions - so you should be able to start a virtual machine from inside the app and keep it hidden from the user - with the USB display still working. It is some work but you can make this entirely transparent.Nerty
I've always suspected that VB should have a console mode, but never tried it, thanks. The one remaining thing is to find a USB-to-VGA (-DVI/-HDMI) adapter with a Linux driver to avoid buying another license of Windows. I think we'll manage this;)Pouter
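
Picking up the headless hint from the comments above: starting the VM from inside the app is a one-liner. A tiny sketch, where the VM name "display-vm" is a placeholder and VBoxManage is assumed to be on the PATH:

    // Start the (hypothetically named) VM with no visible window;
    // "--type headless" suppresses the VirtualBox GUI entirely.
    #include <cstdlib>

    int main() {
        return std::system("VBoxManage startvm \"display-vm\" --type headless");
    }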

Here is another simple solution to monitor your application.

Your app should provide a monitoring API service, served over HTTP on any port you want (for example http://{userip}:{port}/{appname}/monitor).

Your app monitors itself, keeping monitoring data in memory, in a local file or a database, hidden from the user. The monitor API serves this data to any device you want that has a browser (tablet, phone, netbook, android mini-PC, low cost linux device, any PC or any OS... from the internet, your LAN or direct connection to the PC hosting the app).
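
As a minimal sketch of such an endpoint, here is one possible implementation using the cpp-httplib single-header library; the path, port, and JSON payload are placeholders:

    // Minimal sketch of the HTTP monitor endpoint using cpp-httplib
    // (https://github.com/yhirose/cpp-httplib). Path, port, and the JSON
    // snapshot below are illustrative placeholders, not a fixed contract.
    #include "httplib.h"
    #include <string>

    int main() {
        httplib::Server svr;

        // Serve the in-memory monitoring data collected by the app.
        svr.Get("/myapp/monitor",
                [](const httplib::Request&, httplib::Response& res) {
            std::string snapshot = "{\"frames_drawn\": 1234, \"status\": \"ok\"}";
            res.set_content(snapshot, "application/json");
        });

        svr.listen("0.0.0.0", 8080);  // reachable from any device on the LAN
        return 0;
    }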

Pros:

  • Data to monitor is collected (and served) within your app: only one executable
  • Display can be done remotely: from anywhere!
  • Access security is easily handled using standard HTTP authentication mechanisms
  • You can monitor several applications (i.e. several monitoring URLs)
  • You are free to use any browser to monitor (even a local browser window on the same PC for testing purposes)
  • Monitor from any hardware and OS you want
  • Simple and flexible!

Cons:

  • There are few, but tell me...

Choosing this solution depends on what kind of data you need to monitor (text, images, video...) and on the refresh rate you expect, given your network configuration.

Hope it helps :)

Culottes answered 26/12, 2013 at 14:32 Comment(1)
Thank you for the effort! Your solution elaborates the same idea about a separate device, and additionally suggests using HTTP protocol (web page) to transfer my images. Unfortunately, we don't need to monitor our app - we need to draw patterns on the monitor (display). And this process needs to be controlled by the app, not by someone refreshing the web page (and in quite a high frame rate too). This makes browser solution quite irrelevant;( Sorry, I and Mikhail should have stated this more clearly in the question.Pouter
