9/23/2021

GPD P2 Max - Updating the Windows 7 graphics device driver

 The GPD P2 Max is a handheld (Ultra) PC that weighs around 600 grams. It comes preinstalled with Windows 10 and can run Windows 8.1, but really shines when running Windows 7.

It's not advertised as supporting installing or running Windows 7, because the underlying mainboard of the platform is newer than what Intel and Microsoft agreed to as the cutoff point for installing Windows 7 on new hardware.

It appears, however, that this is simply an arbitrary cutoff, as Windows 7 installs and runs perfectly fine on the GPD P2 Max.

The BIOS of the GPD P2 Max does not have the Legacy CSM BIOS extensions supporting the INT10 interrupt call, which some of the installers and post-install support systems depend on.

This has led many people to try, and eventually give up on, installing Windows 7 on platforms without CSM support.

Most BIOSes today are not really 16-bit BIOSes, but rather UEFI firmware pre-boot operating systems. As such, a small program or shim can be written and loaded into memory before the user operating system boots in order to provide features like an INT10 BIOS call routine.

The consumer program Flashboot Pro provides such a shim and offers it during the creation of a USB boot thumb drive for installing Windows 7. In addition, it offers to inject generic NVMe SSD and USB 3.0 device drivers to assist with boot hardware that depends on those devices, since those technologies did not exist when the original Windows 7 installation media was mastered.
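For reference, the same sort of driver injection can also be done by hand with Microsoft's DISM tool against the install.wim on the thumb drive. This is only a sketch: the drive letters, mount directory, and image index below are assumptions, not values taken from Flashboot Pro.

rem Sketch only: D: is assumed to be the thumb drive, C:\drivers is assumed to hold the extracted NVMe/USB3 drivers
mkdir C:\mount
dism /Mount-Wim /WimFile:D:\sources\install.wim /Index:1 /MountDir:C:\mount
dism /Image:C:\mount /Add-Driver /Driver:C:\drivers /Recurse
dism /Unmount-Wim /MountDir:C:\mount /Commit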

After booting from the thumb drive (which requires the FN+DEL keyboard combo to get into the UEFI BIOS and select the USB boot media as preferred), the Windows 7 installation proceeds as normal.

After first boot and user setup, several orange safety cones in Device Manager alert the user to missing device driver support for some of the hardware attached to the mainboard of the GPD P2 Max. The physical keyboard also doesn't seem to work at this point, so I used the on-screen keyboard to complete setup.

Most driver issues can be resolved by downloading and running the Internet-enabled Snappy Driver Installer application. It's advisable to take things slow and to repeatedly back up using a full system imaging tool like Macrium Reflect and a bootable, large USB thumb drive to capture and deploy backup images.

Device driver hunting is full of unfortunate events that often lead to unrecoverable BSoD messages.

It's faster to restore a working condition from a Macrium image backup than to repeat the Windows 7 installation procedure.

The most important device driver to replace or update is the Windows standard video driver. By default the Basic display driver runs at the maximum pixel resolution the LCD panel supports, and this can be very hard to navigate.

Windows Update and Snappy Driver Installer will not be able to deliver a working Intel HD device driver for the GPU built into the mobile processor.

Rather, you need an Intel HD device driver of a particular vintage that supports both Windows 7 and Windows 10 installs. Many of these have been removed from the web by Intel, but third-party sources still abound.

The INF file in the graphics driver folder needs to be modified after downloading it, so that the device ID the GPU reports on the PCI bus matches a device driver entry in the INF file.

This works:

%iKBLULTGT2% = iSKLD_w7, PCI\VEN_8086&DEV_591C
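For context, the device ID to match can be read in Device Manager (Details tab, "Hardware Ids" property), and the line above lives in the models section of the Intel INF. A minimal sketch of the surrounding section follows; the section name is an assumption based on typical Intel graphics INFs, not copied from this exact package.

; 64-bit Windows 7 models section (section name assumed)
[IntelGfx.NTamd64.6.1]
%iKBLULTGT2% = iSKLD_w7, PCI\VEN_8086&DEV_591C  ; hardware ID the P2 Max GPU reports on the PCI bus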

It's rather confusing if you look at the problem straight on.

The literature says this CPU should have an Intel HD 615 GT2 GPU. However, that device driver does not work.

What does work is the device driver for an Intel UHD 620 GT2, but once all the Intel support tools are installed, they report the GPU as an Intel HD 630 GT2.

 Basically.. ignore the confusion and declare victory and move on.

The device driver is fully functional and video playback is superb and quite reliable.

If I had to guess, I would think the confusion arose because Intel released an enhanced version of the mobile CPU, partnered it with a later, more capable GPU, and didn't bother updating or creating a specific Windows 7 device driver for it. Instead they used a later GPU which was already partnered with older CPUs that already had working Windows 7 device drivers. So effectively it has an older, higher-performance GPU for which there is a Windows 7 device driver.

This CPU was the first 14 nm process (smaller) die which could accommodate the larger, older, higher-performance GPU.. so the result was a win-win for the customer.

 Effectively.. I "guess" they pre-released what they thought would be accurate information at the time, and changed the die design during production once they found they had the capability.

Windows 7 Aero is fully functional and DirectX 11 is available. The Windows Experience Index rating is 6.8 out of a possible 7.9, exceeded only by the higher-performing CPU; the weakest link is the high-performance GPU.

I am not a Gamer.

But I would assume this is a sweet spot for gamers; this is the scenario you would prefer.. a more capable CPU partnered with a high-performance GPU, closely matched in fact, to take best advantage of the strengths of each.


9/02/2021

Black Snow, A Fast way to Warm up Mars

The Martian moons Phobos and Deimos are made up of some of the darkest material in the Solar System. If a solar-powered mass driver were placed on Phobos, the dark material could be mined and used as a kind of rocket fuel: the recoil of the mass driver hurling mined slugs of the material onto and over the Martian polar ice caps would propel the moon.

This would change Phobos's orbit while also lowering its mass, and could eventually place it in a polar orbit. Larger chunks could then be deployed to continue covering the polar caps with the dark, carbon-like material, which would absorb light and melt the polar caps.

Lowering the mass of Phobos would also make it more likely to break apart and easier to mine; close to the Roche limit it should be very easy to redirect huge portions into a controlled break-up over the Martian polar caps to maximize coverage and heating effects.

As relics of the frost belt, and having been in orbit about Mars when it lost most of its water, there is also a good chance that a significant portion of the material contains water, which could be salvaged for rocket fuel for orbital travel and for decelerating incoming spacecraft.

Deimos, being smaller and in a higher orbit, could also be used as an electromagnetic dynamo to create a sustained magnetic field, powered by its orbital kinetic energy, by trailing a long dipole antenna from the moon some distance toward the planet.

This was demonstrated in a tether experiment conducted on the Space Shuttle in Earth orbit. Scaled up.. it should provide some protection, as from a solar flare, at certain altitudes for any travelers or bases in orbit.. and on the surface at certain latitudes.


8/26/2021

Artweaver How to Unlock Background Layer

Save an open image file as PNG and choose 32 bit, not 24 bit.

Then close the PNG file and reopen it.

The image opens as 'Layer 1' with no Background layer, and it is unlocked.

If you save this same file, with no Background layer, as an Artweaver .AWD file, it retains this property.

If you save it as a Photoshop .PSD file, it also retains this property.

If you save it as a JPEG file, it loses this property and the image gets converted back into a locked Background layer.

6/04/2021

Ancient history, time base correction and frame sync or genlock

Time base correction (TBC) for a VCR, or used during VHS tape capture, seems to relate mostly to correcting the horizontal sync pulse and the video line timing as each line passes through a one- or two-line buffer.

A line-buffer type of TBC was the most common until digital memory prices fell far enough to make field or full-frame "synchronizers" more practical.

What this means is that a Level 1 TBC (a TBC with a buffer of a small number of lines) would potentially "band" or stagger its fixes across the entire field and frame of the picture on output.. but usually this was unnoticeable unless the picture was scaled up, at which point aliasing artifacts could appear. This appears to be why it can be a good thing to disable an older "line-based" time base correcting circuit built into a VCR when planning to capture, and/or when scaling up a capture using an external device called a "scaler".

What this means is that a Level 2 TBC (a field-only TBC) could potentially "jump" or de-synchronize between fields.. perceived more as a glitch, mostly depending on how bad the vertical sync retrace or VTI was between frames. Since "field based" TBCs were not on the market long before being replaced by the more expensive "full frame" time base correctors.. these problems are not as common as others. However, they could produce a superior capture experience.. since they would not attempt to de-interlace the picture with an outdated, poorer-quality de-interlacing method.

What this means is that a Level 3 TBC (a full-frame TBC) would normally be best.. with one exception: at some point the frame output (which may also perform a poor version of de-interlacing, since that is a simpler output circuit than leaving the picture interlaced) may have to drop or repeat a frame in order to remain "locked" to a data frequency that matches the broadcast standard. While this can be minimized by using genlock, it can also cause vertical frame jumping in a free-running workflow with no genlock in use. A free-running full-frame TBC instead places that burden on the capture card, since at some point there will be a buffer overrun or underrun.. and the capture card hardware or software will have to decide how to handle the situation. So it is a choice of leaving that decision up to the TBC, or leaving it up to the capture card.
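As a toy illustration of that drop-or-repeat decision (a sketch only, not how any particular TBC or capture card actually implements it), imagine frames arriving from the tape at a slightly unstable rate while the output side must hand over exactly one frame per output tick:

from collections import deque

class FreeRunningFrameStore:
    """Toy model: the output clock is fixed, the input clock drifts."""
    def __init__(self, max_depth=3):
        self.buffer = deque()
        self.max_depth = max_depth
        self.last_frame = None

    def frame_in(self, frame):
        if len(self.buffer) >= self.max_depth:
            self.buffer.popleft()      # overrun: drop the oldest buffered frame to stay locked
        self.buffer.append(frame)

    def frame_out(self):
        if self.buffer:
            self.last_frame = self.buffer.popleft()
        # underrun: repeat the last good frame rather than break lock
        return self.last_frame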

TBC, "time base correction", traditionally focused on curing problems with the left "edge", that is the horizontal timing of the scan lines, while "genlock" or frame synchronization focused on curing problems with the top "edge", or the top and bottom edges of the field or frame, as each field or frame was displayed and the next was set up by moving the scanning beam back to the top of the visual field.

So "tbc" is mostly about left-edge problems and "frame sync" is mostly about top-edge problems.

When things go wrong:

A flag waving from the upper top-left edge is fixed using a TBC, specifically one that "locks" its left-edge "frame clock" before additional lines are displayed.. a TBC that waits and samples more lines may "drift" back and forth, releasing some of the top-left lines before it settles down and locks. Such a TBC locks "fast".. as opposed to a broadcast TBC, which is seeking to lock onto an over-the-air transmission that may have additional problems related to signal ghosting, and therefore locks "slow". Some broadcast TBCs have a setting specifically to support use with a "VTR" in the studio environment.. which locks "fast".

A "vertically jumpy" or "rolling" problem is a frame sync problem and relates to stably outputting a field or frame at the right rate and time to match the expectations of the broadcast standard, or of the house timing genlock signal called black burst.

Ideally and in practice.. a TBC does not do the job of a frame sync.. and a frame sync does not do the job of a TBC. They are two very different things. However, some early devices for broadcast and for the consumer combined the two functions in a single box. Whether such a box actually corrected the left "edge" or not is often the question.

Both use digital computer memory to store different parts of the picture.

A TBC samples, stores, corrects, and outputs line information.

A frame sync stores all of the "assumed" correct line information for a field or frame and outputs field or frame information.

Performing time base correction on the line information before it is captured by the frame sync is important.. since it avoids overruns and underruns in the frame sync, which can make the vertical "jumps" and the frame sync output worse if it is not performed.

TBCs for the most part are no longer made.. and where they are, it is mostly for SDI, or serial digital interface, output. All video signals are assumed to have already been converted to some digital format; they are rendered analog only in exceptional circumstances and are not likely to be broadcast in native analog form.

Frame synchronizers are still made and SD or HD versions are available, often with scalers. But these regularly go for many thousands of dollars used.

Video switchers and DVD recorders are looked upon as potential sources of TBC and frame sync work as pass-through devices.. but for the most part they are frame sync devices only. They "appear" to be time base correctors because they have to digitize the entire field.. but more often they also de-interlace and output the complete frame. The effect is that the output can still have many left-"edge" problems and problems with line length, since those devices never attempted in the slightest to correct line length. Worse.. the problems are "fixed" by simply chopping off badly timed lines and throwing the rest away.. baking in the problems, with no current way to recover the missing information.

True "time base correctors" probably stopped being made around the year 2013, and their use in consumer projects and some commercial projects has probably accelerated the degradation of the supply on the used market.. making them all but unobtainable in good working condition. Some people are servicing the short-lived electrolytic capacitors.. but such units are rare and not often found.

FPGA efforts in the "line doubler" world, that is inexpensive "scaling" for the gaming community and for streaming, are developing the ability to perform many of the functions of a frame sync.. and are also encroaching on the realm of the true time base corrector, since some older game consoles also had left-"edge" problems.


5/29/2021

Blackmagic Thunderbolt Shuttle Video Capture on Windows XP

 Finally got a chance to look at a Shuttle on XP, not the same as an Extreme.

VLC seems to sort of work with it.. but with a lot of trouble.

The Decklink device drivers reject a 720x486 or 720x488 resolution and default to something ridiculous like 720x30.

But the Blackmagic WDM driver seems to behave OK with 720x480, though it doesn't provide an audio stream.. so right now it's very confusing.

The Blackmagic Desktop Video and driver suite for Windows XP that I'm using is the first version I know of that included support for both XP and the Thunderbolt Shuttle.. so I guess it's a bit early and unstable.

Noel's AMCap seems to work just fine with it, as do Blackmagic Media Express and GraphEdit.

5/28/2021

Blackmagic Thunderbolt Extreme Video Capture on Windows XP

It would appear the Media Express capture program is not DirectShow based. 

This would make sense, as Blackmagic makes versions for other platforms. It appears to use the Qt user interface library, which is also cross-platform.

A best guess would be that, like VLC, they use libavcodec or a wrapper such as FFmpeg to get at libavcodec.. but I'm betting they had to keep it simple. There is an LGPL notice included with the program that might refer to the Qt libraries used.. or to something deeper in the code. I did not see a separate libavcodec DLL included with the program, so if it is used.. its functions are integrated into the Media Express binary as a static compile. It is a medium-sized executable dedicated to one purpose.

The weak link in all of my captures has been uncompressed capture at 8- and 10-bit depth direct to hard disk.

Media Express does a good job of capture and maintains sync as far as I have tested.. but my hard drive speed is borderline and would probably not keep up long term.

VLC so far has not been able to keep up and appears to lose sync between the audio and video tracks when capturing with the default Raw Record button. Adding a transcoder to capture as MPEG-2 or H.264 would seem to be a losing proposition.. but it might work, if the CPU is not overloaded and can reduce the stream rate such that the hard disk can keep up. Huffyuv is also possible with VLC and might make a reasonable compromise. -- noted that VLC is also capable of decoding and playing back Huffyuv on the fly! Which is quite impressive.
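As an illustration of what such a Huffyuv capture command line can look like (a sketch only; the DirectShow device names, the HFYU codec FourCC, and the output path are assumptions for illustration, not a tested recipe for the Intensity Extreme):

vlc dshow:// :dshow-vdev="Blackmagic WDM Capture" :dshow-adev="Blackmagic Audio" --sout "#transcode{vcodec=HFYU,acodec=s16l}:std{access=file,mux=avi,dst=C:\capture\test.avi}"

If the CPU can't keep up with Huffyuv, swapping the transcode block for an MPEG-2 or H.264 encoder is the same idea with a different codec name and bitrate.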

The Blackmagic Intensity Extreme is about ten years old, and in those days the limitation was, as now, hard disk speed; in fact Blackmagic includes tools to test disk speed.. and recommends always using a RAID 0 setup to stripe the load across multiple disks. Today an SSD might be able to keep up, but an individual SATA drive connection may still be too slow.

Noel's AMCap for Windows XP is no longer sold.. however due to his kind response to a plea for help, he did sell me a copy.. and that does appear to work excellently with the Blackmagic Intensity Extreme on the Windows XP laptop across the Thunderbolt connection.

Noel's AMCap for Windows now focuses on 64-bit versions of Windows and more modern versions of the Windows kernel.. Windows XP compatibility had to be left behind with version 9 of AMCap.

VirtualDub 1.9.11 does a respectable job of capture, but compared to Media Express, or even VLC.. it is a rather large and excessive program just for capture. It has measures to try to help with hard disk speed, like preallocation and buffers.. and logic for dealing with hard disk overruns.. but they are kind of overwhelming to master right away. Still, it does a good job from what I can see.


5/27/2021

The Internals of VLC and How it works

VLC, or the VideoLAN Client, retconned to just VLC.

It is (or was) planned as the media player in a suite of tools for sharing audio/video media files from a satellite downlink on a university campus in France.

It was a student-led effort that evolved and was open-sourced in the year 2000.

Because not all students used a Mac, a PC, or a Linux desktop, they chose to design it to be portable across all those platforms. It has since been ported to iOS and Android as well.

Basically, the terminology changes depending on which era of programming, or which desktop PC platform, you're currently on, but it is a collection of "mini" programs called "modules" linked together in a chain by a "core", whichever interface is being used.

It's mostly a "loader" with a user interface stuck on the front.

The interface can be a command-line string of commands, as in a command processor script, or it can be a GUI for a windowing system, built with wxWidgets, Qt, or Cocoa.

In the background, however, once it has started, it is just making calls to load and start up "modules" and passing them commands, parsed from the user's instructions, to configure them.

It's very similar to the way a GraphEdit graph works in DirectShow to load "filters", but it does not use filters to achieve its goals. People often assume it should expose a filter graph through the graph builder interface on Windows.. but it does not. Instead, all of its functions are handled by a series of monolithic "modules" which accept commands from VLC's own types of user interface.

So basically VLC was not designed "only for Windows" and does not adopt the DirectShow method of creating filter graphs for performing its functions.

At its center VLC is a "muxer", which means it takes audio and video streams in, performs some operations on them, blends them together, and then shoves them out in a new format. While it is doing that, it can also split off a duplicate stream and "display" what is currently going through the stream.

Initially it was intended to accept UDP and TCP streams; later, reading a DVD device or .ISO file was added, then taking in the output from a capture device or a TV tuner.. these were all added one at a time.. to bring it to where it is today.

Just like splitting off the display duplicate stream, the Record button on the advanced user interface can direct a duplicate stream of the input data straight to a file, which is automatically named and stored in a pre-configured directory for videos. No processing is performed on a stream "captured" as a Record file.. which is rather the exception than the rule with VLC.. and a default Capture/Record processing chain cannot be configured from the GUI.

Separate from a "Capture/Record" is a "Capture/Convert" (or "Save").

Using the "Convert" button instead of "Play" ("Play" starts a stream already configured and stored in the playlist, or completes the process of adding a stream to the list and then starts playing it) spawns a wizard for configuring an input and selecting a predefined "profile", which will process a twin set of audio and video streams and mux them before sending them to a final directory and file destination.

Profiles come preconfigured with a default VLC install, but new ones can easily be created.

The same Convert "Wizard" can be used to complete a File, DVD, Network or Capture Card stream along with its mux processing profile.

The "Stream" button is basically the same thing, but tailored more towards "re-streaming" as fast as possible.. possibly without muxing.

The GUI user interface layers a conceptual way of looking at the command-line way of building up what DirectShow terminology would call a "filter graph".. even though it is not a DirectShow filter graph, it is basically the same thing, "conceptually".

The VLC command-line invocation does not replicate the GUI; the GUI is task-oriented, while the command line better represents the actual process of building up a single-minded, single-task pipeline: configuring an input source, configuring the modules ("VLC filters") that will operate on that source, and then multiplexing the result into a single stream and delivering it to a file or other destination.
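As a rough illustration of that shape (a sketch only; the capture device name, the codec settings, and the destination path below are placeholder assumptions), a single invocation that takes a DirectShow source, transcodes it, and both displays it and writes it to a file chains the modules like this:

vlc dshow:// :dshow-vdev="Some Capture Device" --sout "#transcode{vcodec=mp2v,vb=8000,acodec=mpga,ab=192}:duplicate{dst=display,dst=std{access=file,mux=ps,dst=C:\videos\out.mpg}}"

Reading left to right, that is exactly the pipeline described above: an input, a transcode module, a duplicate split for display, and a standard output module that muxes and delivers the stream to a file.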

Because of the way VLC is built, and because its core is libavcodec.. it is sometimes desirable to "use" VLC as a DirectShow "filter" itself.

Doing this is possible.

Sensoray open-sourced a VideoLAN client bridge.

The DirectShow filter appears as a sink and is preconfigured to save a stream as a processed, muxed file, currently only as an MPEG-1 or MPEG-2 file.