9/10/2018

LRM-519 enabling S-Video recordings

The LRM-519 is an LG DVR which relied on the Microsoft Guide service until that service was discontinued. The initial setup of the LRM-519 used the Guide service while configuring Inputs and Channels from its NTSC tuner. Without the Guide service it is not possible to set up Inputs for S-Video or Composite.

However, a hard disk from an LRM-519 that was set up with a Satellite Input before the Guide service was discontinued can be cloned and inserted into another LRM-519, enabling that unit to be used as an S-Video or Composite recorder.

Manual recordings from the Satellite inputs are stored in MPEG-2 format, and the recordings can then be sent to a PC file share.

9/04/2018

empia 2861, Startech SVID2USB23 enabling audio capture

When installing the Startech SVID2USB23, which is a composite (multi-function) USB device, it may expose only one 2861 device under "Sound, video and game controllers" in Device Manager and nothing else. This is indicative of a larger operating system problem under Windows 7.

The C:\Windows\inf\usb.inf file is missing, and it will not be detected as missing even by sfc /scannow.

Many backup copies exist on the system; a file search for "usb.inf" with a tool like voidtools "Everything" will find them for you, and possibly Windows Search targeting the C:\Windows subdirectory (although I gave up on Windows Search long ago and disabled it).

Simply copy one of the usb.inf backups into the C:\Windows\inf directory, then unplug and re-plug the 2861 device (SVID2USB23).
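As a sketch of that recovery from an elevated command prompt (the DriverStore source path below is illustrative only; substitute a real hit from the search):

```shell
:: Find backup copies of usb.inf anywhere under C:\Windows
dir /s /b C:\Windows\usb.inf

:: Copy one of the backups found above into the live inf directory
:: (example source path -- use whatever the search actually returned)
copy "C:\Windows\System32\DriverStore\FileRepository\usb.inf_amd64_neutral_00000000\usb.inf" "C:\Windows\inf\usb.inf"
```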

The first device that appears under "Sound, video and game controllers" represents the first endpoint in the cable dongle, the video capture device. The sound device is the second endpoint on the bus and fails to appear without usb.inf.

Once usb.inf is in place, the first endpoint will be detected as [ Imaging Device 2861 ] and the second endpoint as [ USB Audio ], which appears as [ Line (2861) ] under "Recording Devices" in the Windows sound mixer. A correct "Update Driver" pass installing the EMAudio driver will convert that into the sound device (2861 Audio).

VLC is one of the few programs which can properly enumerate a USB audio device as a selectable audio input for recording or playback. Many legacy audio capture programs simply list "Master Volume" and do not offer a choice of all available audio sources. Without this level of control, the audio input from the USB device cannot be turned on or have its levels set.

The SVID2USB23 also has S-Video and yellow Composite video inputs. If you receive a blank black screen on recording, the crossbar video input may need to be switched over to the S-Video input to capture the video signal.

More usb.inf info here:

USB Generic parent driver

And more specifically here

Tools

USBaudio 

LRM-519, enabling SMB file share uploads

The LG LRM-519 is a DVD recorder with a tuner and record-to-HDD capability. It can also be configured to [Send to PC] a recording, provided the Windows 7 machine is a member of a Workgroup (not a Domain) and HomeGroup file sharing is not turned on.

In addition, the IPSec MMC snap-in, which can only be exposed by creating a new MMC console and adding the IPSec snap-in, can host a rule set called [ WND ] which blocks SMB connections by default and enormously slows down "Kerberos" authenticated logins, or any password-authenticated logins. In fact it will stop correct username/password authenticated logins in many cases.

The problem appears to be enforcement of a Kerberos-specific rule that lingers after passing through a Domain-joined phase, or possibly an erroneous Windows patch, which happens quite often.

Rather than reinstalling the machine, create a new mmc.exe instance and add the IPSec rule-set snap-in. This is not the same thing as the Firewall IPSec plug-in, nor the IPSec VPN connection tool; it is a legacy, totally off-the-radar snap-in which also buries its rule sets and does not expose them unless you specifically dig into the IPSec plug-in. All rule sets appear blank until highlighted, and the Inbound set must be clicked on and then refreshed. Quite hidden, and quite bad programming.
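A sketch of a quicker way to spot a lingering rule set like [ WND ] without clicking through the snap-in: the legacy netsh ipsec context can dump the locally stored policies from a command prompt.

```shell
:: Dump all locally stored IPSec policies, rules and filter lists
netsh ipsec static show all

:: Or list just the policies, to spot a rule set by name
netsh ipsec static show policy all
```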

8/20/2018

HP Stream N3050 Braswell, booting Win To Go

More info from the edge. This weekend has been quite a bit of real experience with UEFI up close and personal using an HP Stream G2 netbook.

The BIOS defaults to booting Win10 from a GPT EFI partition and runs Win10 from an NTFS partition. The NTFS storage is divided into two volumes on an eMMC (embedded MultiMediaCard) chip bonded onto the main/motherboard.

Unlike Chromebooks, which have a stripped-down, bare-bones BIOS, the Stream book has full features for booting from an MBR- or GPT-partitioned storage device, using the eMMC or a USB bus device connected to the main/motherboard. It cannot, however, boot from a microSD storage device even though it has an SD memory slot.

What that means is the BIOS will scan those two buses, eMMC and USB, for storage devices and then look on them for MBR- or GPT-partitioned disks.

When it does, any MBR or GPT partitioned device with either an "Active" or "EFI" marked partition will be listed in its selectable boot menu (Esc, then F9) and can be chosen to attempt an operating system boot.

MBR is a mini-index format at the top of the drive describing four primary partitions (plus logical partitions), used for locating an "Active" boot partition within the first 2 TB of an LBA (logical block addressing) drive, limited by the sector size and/or the number of blocks that a 16-bit BIOS can address.

The MBR slots contain the actual position within those blocks where the first boot sector of an Active partition can be found. The BIOS automatically loads the first few blocks into memory and gives control over to them by setting the CPU program counter to the first one and executing. Generally this is a tiny machine-language program that "bootstraps" by loading the next few sectors, which contain a filesystem driver for accessing that partition's specific filesystem, then scans for the second-stage bootloader, the pre-loader for that operating system. The preloader generally does things specific to the operating system, like preloading into memory the device drivers the kernel will need in order to access the rest of the storage device.

Then the preloader loads the kernel into memory and jumps to its startup routine. The operating system decompresses the kernel, performs an inventory plus any hardware-specific platform or bus initialization, runs plug-and-play autodetection, loads and initializes additional device drivers, and then begins to provide feedback to the end user via the default local console (a serial port or VGA monitor). Finally it turns control over to a window manager and presents a logon screen. Those are the general steps of most "windows-like" systems, be they Microsoft, Apple or Linux.

GPT is a multiply redundant partitioning system, fronted by a "protective" (fake) MBR, for use on storage devices larger than 2 TB.

The fake MBR is created at the top of the drive to mark the drive with a serial number and to indicate to old-style BIOS boot systems that the drive is in use and contains information they can't see. At minimum it signals that the drive is not "blank" and that caution should be exercised if the BIOS cannot further scan the storage device.

GPT keeps multiple copies of its mini-index of partitions: a primary at the top of the drive and a backup at the bottom. If one becomes damaged, the copies plus their checksums can be used to detect the damage and determine which copy contains the accurate information, without having to perform extensive bitwise parity reconstruction of the data. Like MBR, the partitions can be marked with a partition "type": for MBR that is simply active or not active, but for GPT it is more extensive. Generally one partition will be marked with the type code of an ESP (EFI System Partition), a FAT32-formatted partition intended to contain small programs the UEFI firmware can load into memory and execute, be they "C" programs for performing diagnostics, maintenance or bootstrapping an operating system.
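A quick way to see which partitioning scheme a drive uses is an annotated diskpart session from an elevated command prompt (the transcript below is a sketch; disk numbers vary by machine):

```shell
diskpart
DISKPART> list disk
:: an asterisk in the "Gpt" column marks GPT-partitioned discs
DISKPART> select disk 0
DISKPART> list partition
:: on a GPT disc this shows the System (ESP), Reserved and Primary partitions
DISKPART> exit
```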

Since these are 32- or 64-bit programs they can use wider block addresses to seek deeper into an LBA drive, and aren't limited by the MBR-LBA limit of 2 TB on storage devices with 512, 2048 or 4096 byte sectors. Some later 16-bit BIOSes could work with larger-sector discs, but not all.

More complex programs are possible than could fit in a 16-bit boot sector. While MBR could load overlay programs to do something similar, few hard drive manufacturers released the hardware-specific details to make this possible, and moving the ability into programs loaded by UEFI made the approach easier to adopt broadly.

.. to be continued


8/19/2018

Win2Go Win7 on an HP Stream G2, booting from UEFI, GPT, NTFS

Didn't take as long as I thought. The HP Stream book BIOS will recognize, and can be set to autoboot from, a USB flash drive with a GPT EFI partition and a native NTFS file system on an uncertified "Removable" type disc. Boot time 25 sec, shutdown time 12 sec.

You're not supposed to be able to do this, for several reasons.

First, Windows 7 up through Windows 10 1703 would recognize a USB flash drive as type "Removable" and (A) prevent creating multiple partitions, so no separate EFI partition was possible; and (B) if multiple partitions were somehow created on another operating system, it would enumerate with a drive letter only the first primary partition it found with a supported file system, or the first logical partition if no others were found. Since Windows always created three to four partitions in the order EFI, WinRE, Boot/System, Recovery, the Boot/System partition would never be enumerated, and thus setup files could not be copied to that partition.

Certain [very expensive] "Certified for Windows To Go, Win8/Win10" USB flash drives for the elite editions could have their RMB (removable media bit) flag changed in firmware to announce themselves as disc type "Fixed", which would allow Win8 or Win10 to create special Windows To Go bootable discs.

And if you actually did have a special certified disc but not a special edition of Win8 or Win10, Setup would refuse to continue at the point where you had answered all of the interview questions and committed to install. The message would say USB and IEEE 1394 connected devices were not supported for install; this includes SSDs or external hard drives connected by a USB or FireWire connection.

This is a lot of barriers stacked against even the possibility of making this happen.

Most bootable "Windows To Go" builder tools can't work around all these barriers. Some will partition the whole drive as an MBR-type disc, format one partition as FAT32 and copy a SysPrep-style unattended setup onto it, installing Windows 7 regardless of the barrier against installing on a USB or FireWire disc. This works by bypassing the "guided interview" step that allows the Setup program to detect an unsupported device and halt the setup.

Then you would have to have a BIOS that supports USB 2.0 booting from an MBR disc, has a Legacy CSM, and can boot unsecured (unsigned) kernels like Windows 7. Computers of this type all but disappeared after 2012, when USB 3.0 ports became the norm on nearly all platforms seeking to support Windows 8.0. Windows 7 does not come with in-box device drivers, and certainly not boot-time enabled "signed" device drivers, for USB 3.0 ports. Without a boot-time device driver to power up the port and enumerate any attached USB device, the storage for the Boot/System volume is simply not available to the Windows kernel and startup halts.

You can auto-install a selection of USB 3.0, NVMe and AHCI device drivers into all of the boot.wim and install.wim "clg" class-group (edition or version) images on a multi-install media ISO and build an MBR-bootable SysPrep installer. But different USB 3.0 hardware controllers exist; for those you can use the dism.exe tool to manually /forceunsigned install more drivers, which will prevent a STOP 0xc0..07B error.

You [can] arrange two partitions on an uncertified USB flash drive so that a native EFI FAT32 partition exists, which a BIOS looking for a GPT-partitioned disc [can] find and offer to auto-boot. By placing the NTFS partition [before] the EFI partition, rather than [after] it, which is the norm with the Windows Setup partition-creation step (and rather unintuitive), the BIOS UEFI-GPT-FAT32 routine can quickly find the EFI partition by its format type (FAT32), even on a large GPT disc, and locate the UEFI bootloader and BCD store that directs it to the NTFS partition. The BCD entries load the USB 3.0 device drivers as boot-start drivers, placing them in memory before the Windows kernel starts, which makes the USB 3.0 ports live and the devices attached to them accessible for fetching the remainder of the Windows system, bootstrapping the SysPrep installer and completing the normal Windows installation.

You can take advantage of those [pre-existing] NTFS-then-EFI partitions to make a Boot and System partition with a bootable SysPrep image for Windows 7, enabling complete setup and install.
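A minimal diskpart sketch of that "NTFS first, EFI second" layout. Disk 1 is assumed to be the USB flash drive, and the sizes are illustrative for a 128 GB stick; verify with list disk before running, since clean is destructive:

```shell
select disk 1
clean
convert gpt
:: NTFS Boot/System volume first, occupying most of the drive
create partition primary size=118000
format fs=ntfs quick label="W2G"
assign letter=E
:: small FAT32 EFI System Partition second, at the end
create partition efi size=260
format fs=fat32 quick label="ESP"
assign letter=S
```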

That seems a lot of balls to juggle.

But this has usefulness beyond this one application: it means USB 3.0 devices other than flash drives attached to USB 3.0 ports can be used as installation targets, including fixed USB HDDs, which are normally excluded as a target by the Setup program, and FireWire-attached drives. External SSD and IEEE 1394 devices are generally faster, more reliable, and easier to find than a high-capacity USB flash drive, which might wear out sooner rather than later. But with 3 and 5 year flash drive warranties, it's mostly a personal file-backup hygiene issue. Backup4Sure is a really good folder-to-zip-container automated backup tool, which is great for this purpose.

In old Apple terms this is a [Target Mode] boot capability for all Windows versions, allowing many recovery and backup scenarios, or forensic and diagnostic modes, on current and UEFI-only hardware. CSM and GOP boot gates are still a minor nuisance, but can be circumvented to permit pure UEFI environments to work the same way.

Ultimately this all boils down to a simple setup procedure.

Maintenance, including adding device drivers, now comes down to one command line for adding new device drivers to the offline image while the USB drive is plugged into a Win7 or higher computer and mounted as a plain file-storage device.
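That one command is dism's offline driver servicing. A sketch, assuming the USB drive's Windows volume mounts as E: and the unpacked drivers sit under C:\drivers (both paths illustrative):

```shell
:: Inject every driver found under C:\drivers into the offline
:: Windows installation on E:\ (no WIM mounting required)
dism /Image:E:\ /Add-Driver /Driver:C:\drivers /Recurse /ForceUnsigned
```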

No .wim files, no .vhd/.vhdx images and no ISO images are used on the USB flash drive; everything is plain NTFS file-system files, and thus not limited by the 4 GB file-size limit that a FAT32 file system would impose. Any file can be updated on another system, or copied and replaced, at any time.

It's very flexible and low-maintenance, and a very "familiar" tech-support style approaching what was available with DOS FAT discs many years ago. Even SSD portable drives were excluded from being used as bootable Windows To Go devices with Windows 7.

I used a [ SanDisk Ultra Fit USB 3.0 - 128 GB - 150 MB/s - Flash Drive ] for this, which is roomy and fast, but it should work with many other styles and types of USB 3.0 media. The onboard eMMC flash on the HP Stream G2 was undetected and inaccessible from Windows 7, but it is inexpensive (if hard to replace or upgrade) storage, and it still holds a copy of Windows 10. It can be used as a backup method of booting the netbook, and for backing up or copying files to the USB 3.0 flash drive when Windows 10 is taking the lead as the "online" operating system; Windows 10 can fully enumerate the NTFS partition on the flash drive and natively mount, read and write the NTFS files on it. For some reason, however, the Windows 10 native boot is vastly slower than booting from the USB 3.0 flash drive, and much, much slower at shutdown. I do not know if this is a failing of the Windows 10 operating system itself, or due to the poorer performance of the onboard eMMC storage built into the netbook.

In any event it is quite a flexible and high-performance way to downgrade from Windows 10 to Windows 7 and boot from the USB 3.0 flash drive.

8/14/2018

How to Boot, Win7 on Braswell N3050 - HP Stream 11 G2

Before this fades from memory, here is how to install and boot Windows 7 x64 on an HP Stream 11 G2 netbook in a fairly uncompromising manner.

First a little background.

I like the Intel Atom series of netbooks; they are fast, focused and really light on battery power. They tend to last all day and were made with some really up-to-date tech until Windows 10. The later generations included Windows 8 or Windows 10 "exclusive" models with eMMC memory for boot media.

The problem with eMMC is that it is low-tech and non-standard, in that it is usually bonded directly, in a random fashion, onto a netbook or phone board, such that where and how to access it is unpredictable and depends mostly on the vendor providing a custom device driver for the part.

Windows 8 did come with eMMC drivers, and netbooks did come with a BIOS or UEFI boot manager which could pick up the thread and load the OS into memory, but it was very slow and prone to wear, and nothing like NAND flash with a proper controller or an SSD. Typically eMMC is used for storing photographs, has only a rudimentary controller, and is just awful in general. It is better off ignored. And when eMMC burns out it can't be replaced.

Later versions of the netbooks included a "standard" USB 3.0 port or all USB 3.0 ports, which are a lot faster than USB 2.0. Then "Windows To Go" was introduced for Windows 8, but not Windows 7. And Windows 7 does not have bootable USB 3.0 device drivers.

The "trick" however is to create a "Windows To Go" like USB 3.0 install of Windows 7 on a USB 3.0 or USB 2.0 class flash drive and boot from that.

"Windows To Go" is not like a bootable LiveCD or ISO and doesn't entirely run from a ramdisk in memory, so effectively it is a full resource operating system running from a fast.. not quite SSD flash drive. And since USB 3.1 flash drives are available, even with their own SSD controllers external to the netbook, they can be even faster than the eMMC built into the laptop. The external accessiblity of the flash drive also makes it replaceable / repairable should something happen to the boot media.. and at 128 GB or 256 GB or beyond the hdd of the netbook becomes near infinite.

Samsung, SanDisk, Kingston and Crucial offer some nice "Low Profile" USB flash drives in 128 or 256 GB capacities which barely rise out of the Left side of the laptop and provide a suitable boot target.

In my case I used a microSD card in a chip sleeve to mount it in the USB 3.0 port. The HP Stream 11 does have a microSD slot, but the BIOS is unable to boot from it directly, although the UEFI boot manager "might" if used in non-Legacy mode. That, however, is research for another day. Besides, leaving the SD slot exposed means even more media diversity. There are two USB 3.0 ports, one microSD port and one HDMI port; sacrificing one USB 3.0 port for a "standard" boot drive keeps many options open.

AOMEI Partition Assistant Standard (demo/freeware) includes a "Windows To Go" creator feature for Windows 7, 8 and 10, independent of the tools available in Windows 8 or Windows 10. It requires a bootable USB target which must already have been formatted with exFAT or NTFS in order to be detected; it can't see the USB drive if it's not formatted. It will wipe and reformat the drive while prepping for the W2G install, but the prerequisite is that a quick format must already have been performed just so the target can be detected.

Once the target is detected and selected, an ISO or WIM image must be selected as the source of the install files. It's important to note that while an ISO is convenient, many ISO images contain more than one install image, and AOMEI will only use the first one found. Thus for a generic Windows 7 ISO it will install "Windows 7 Home Basic", which may not be the version / edition actually desired. Extracting the specific Install.wim image for the version of Windows desired, using 7-Zip or Dism, will allow choosing the specific Windows version / edition when creating the W2G media.
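To see which editions an install.wim carries, and to pull out just the one you want, dism can list and export image indexes. A sketch: the paths and index 4 are illustrative, and /Export-Image needs a newer dism build (the Win8-era ADK or later; on older systems imagex /export does the same job):

```shell
:: List the editions (indexes) inside the image
dism /Get-WimInfo /WimFile:C:\wim\install.wim

:: Export a single edition to its own .wim for the W2G builder
dism /Export-Image /SourceImageFile:C:\wim\install.wim /SourceIndex:4 /DestinationImageFile:C:\wim\ultimate.wim
```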

Next, executing the create function will mount the image and (very) slowly create the W2G installation. Note it is not creating a "boot image"; it actually goes through all the steps of formatting the media, mounting the install .wim file and extracting the files to build a "Panther SysPrep" preboot installer / WinPE image on the W2G USB boot media. It can take an hour or longer. The progress bar will creep along, reach 00:00 minutes left, and then pause for a good while as it completes the SysPrep image and dismounts the install .wim image, all of which takes a good deal of time (after) all the files have been copied. It will produce a separate pop-up window that says "Finished" when done, and then offer "Revert" to go again. Do not do this; you are done.

Once this is done the boot media is complete, but it (will not work) on the HP Stream 11 G2 Braswell (until) additional drivers are installed into the "offline" Windows install on the USB flash drive.

The reason is that the Braswell generation has only USB 3.0 ports, no USB 2.0 ports, and Windows 7 cannot load an [On Demand USB 3.0 driver] during or after install in order to retrieve the rest of its operating system files.

Braswell, however, was made by Intel, and Intel made a NUC kit (NUC5PPYH) which used the chipset and included a USB 3.0 device driver as well as an Intel graphics driver. Both will be needed, but the USB 3.0 driver is the most important for completing a successful boot.

USB_3.0_Win7_64_4.0.0.36.zip

GFX_Win7_8.1_10_64_15.40.34.4624.zip

Before installing the drivers to the USB flash drive in an "offline" fashion, they must be extracted, the drivers themselves isolated, and placed in a directory which can be recursively scanned and then applied to the offline SysPrep install on the flash drive.

Do this by opening the driver archive with 7-Zip and copying:

USB 3.0 driver

\HCSwitch
\Win7

folders to

C:\wim\usb3

then

Plug the USB microSD card into a slot and make sure it shows up as E:\

then

"Remove" the x86 version directories from \HCSwitch and \Win7 (recursion will scan folders and try to install all drivers)

then

"Edit"

C:\wim\usb3\Win7\x64\iusb3hub.inf

C:\wim\usb3\Win7\x64\iusb3xhc.inf

"Seek out" and change (from 3 or OnDemand) the StartType to (0) to make them "boot drivers" that must be loaded by the windows bootloader into active memory before turning over control to the kernel


C:\wim\usb3\Win7\x64\iusb3hub.inf
-------------------------------------
[IUsb3HubServiceInstall]
DisplayName   = %iusb3hub.SvcDesc%
ServiceType   = 1
StartType     = 0

C:\wim\usb3\Win7\x64\iusb3xhc.inf
-------------------------------------
[IUsb3XhcModelServiceInstall]
DisplayName   = %iusb3xhc.SvcDesc%
ServiceType   = 1
StartType     = 0


Then add the drivers to the "Offline" SysPrep Installation (since the folders on the USB drive are not in a WIM file you do not have to mount the WIM file first, simply target the E:\ drive).

Switching from an "On Demand" driver to a "Boot" driver start type will throw an error if you don't add them with /forceunsigned, because normal boot drivers are boot-signed.

If you "/forceunsigned" install the drivers, then they will go ahead an install.

If you "/Recurse" install, the Dism command will search all the folders and subfolders for ".inf" installation instruction files (in-struction f-iles = in-f  = .inf  "files")

C:\>dism /Image:E:\ /Add-Driver /Driver:C:\wim\usb3 /forceunsigned  /Recurse

Installing the Graphics driver is slightly different.

Begin by extracting only the

\Graphics

folder to

C:\wim\Graphics

The Windows system about to install the drivers will detect these as "foreign" (downloaded) files and automatically [Block] them from install; attempting a Dism install will result in failure.

The failure will be indicated in the E:\Windows\inf\setupapi.offline.log file

!!!  flq:                CopyFile: FAILED!
!!!  flq:                Error 5: Access is denied.
!!!  flq:                Error installing file (0x00000005)
!!!  flq:                Error 5: Access is denied.
!    flq:                     SourceFile   - 'C:\wim\Graphics\iglhxa64.vp'
     flq:                     TempFile     - 'E:\Windows\System32\DriverStore\FileRepository\igdlh64.inf_amd64_neutral_bfb4178e08406e39\SET893B.tmp'
!    flq:                     TargetFile   - 'E:\Windows\System32\DriverStore\FileRepository\igdlh64.inf_amd64_neutral_bfb4178e08406e39\iglhxa64.vp'

<<<  [Exit status: FAILURE(0x00000005)]

In order to fix this, all the driver files must be [unblocked]. You can do this one at a time with right-click, Unblock, but you can also use the [Sysinternals] "Streams64.exe" tool to do so for the entire directory at one time.

Download

Streams v1.6

Extract Streams64.exe and place it someplace in the path like C:\Windows

This provides a guide:

How to bulk unblock files in Windows 7 or Server 2008
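With Streams64.exe in the path, one recursive pass deletes the "blocked" markers (Zone.Identifier alternate data streams) for the whole driver tree:

```shell
:: -s = recurse subdirectories, -d = delete alternate data streams
:: (the first run of a Sysinternals tool will prompt to accept the EULA)
streams64 -s -d C:\wim\Graphics
```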

And finally apply the driver

C:\>dism /Image:E:\ /Add-Driver /Driver:C:\wim\Graphics /forceunsigned /Recurse

Deployment Image Servicing and Management tool
Version: 6.1.7600.16385

Image Version: 6.1.7600.16385

Searching for driver packages to install...
Found 1 driver package(s) to install.
Installing 1 of 1 - C:\wim\Graphics\igdlh64.inf: The driver package was successfully installed.
The operation completed successfully.

C:\>

First Install

Booting from the USB flash drive may require "catching" the BIOS [press 'Esc' and then press F9] in order to select booting from the USB drive rather than the eMMC. I'm still working on a smoother way to do this, but for now: catch and redirect to the USB drive.

W2G, on first introduction to a new system, "profiles" the hardware and essentially performs a complete SysPrep to detect and configure the operating system.

If the USB 3.0 drivers are boot drivers, the boot loader will load them into memory before turning control over to the kernel, the file system on the USB drive will remain available to the kernel, and the install will proceed as a normal setup (the first time). On subsequent boots this is skipped and it simply boots from the USB flash drive quickly, as if it were a native HDD.

A single reboot will be necessary, during which the graphics driver and the user profiles will be set up, and you will be logged into the desktop. Depending on the original [ Install.wim ] file used to create the W2G USB drive, 'Activation' may require different keys and procedures in order to activate the operating system.

WiFi Networking and Audio

The graphics driver package also included directories with audio drivers to drive the speakers; you could have copied those and installed them at the same time as the graphics driver, or install them post-setup.

The WiFi networking for the HP Stream 11 G2 requires the

AC-7265 driver set

Additional device drivers for the NUC can be found here:

NUC5CPYH 




7/15/2018

Boot Win7 from eMMC flash, a different approach

Thin PCs, or PCs incompatible with anything before Windows 8 and Windows 10, are sort of the norm these days. The problem usually centers around UEFI vs Legacy boot mode and eMMC driver support. This is a different approach.

One of the reasons a thin PC is chosen is low cost, and that means cheap memory: eMMC. Since eMMC didn't standardize until late in the Windows 8.1 days, there wasn't a common denominator to produce a common eMMC driver for boot or normal access.

However, there have been LiveCD approaches since Make_PE3 and WinBuilder that booted XP or Win7 from ramdisk space. Essentially, eMMC isn't great at read/write cycles and really needs to be cared for by the operating system or it will prematurely burn out. Windows XP and Windows 7 did not have great TRIM support for SSDs and certainly didn't support eMMC at all. So booting from a ramdisk and "staying" in ramdisk space will lengthen the life span of an eMMC boot device. RAM is designed for R/W and should last a normal lifespan.

Also there is the updates-and-viruses angle to consider. Updates are brutal on OS stability, and long-term they fragment and degrade an OS until the entire operating system has to be reinstalled fresh.

It's the data that is transient and needs to be copied from one install to the next.

Viruses and malware may insert themselves into boot media and HDD images, but they can only temporarily infect or corrupt data while a LiveCD is in memory; upon reboot the virus or malware is wiped clean. This is the "steady state" or "powerwash" strategy pursued by Microsoft and Google at this time. On a portable device this reboot cycle can be quite frequent, narrowing the window of time in which a virus or malware has to operate, and if the device is air-gapped, narrowing it to approximately zero.

Thus one alternative to booting off eMMC directly is to use Linux GRUB to boot a LiveCD image of XP or Windows 7 into memory, run exclusively in RAM space, and be unable to touch the eMMC at all, effectively turning the device into a pure compute module which needs supplemental storage, be that cloud or USB disk space, which can be attached after the LiveCD boot.
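As a sketch of that boot path with grub4dos, the whole ISO can be mapped into RAM and chainloaded from there (the filename is illustrative, and the image must fit in physical memory):

```
title  Win7 LiveCD (run entirely from RAM)
map --mem /win7pe.iso (0xff)
map --hook
chainloader (0xff)
boot
```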

Turning a disadvantage into an advantage, and also lessening the urgency of updates, at least until a stellar era in which viruses and malware can infect near-"instantaneously" upon joining the net.


6/19/2018

Expandable, Modular, Repairable - Component, HDMI video recorder

A mass-produced brand-name box is unlikely; S-Video was for SD and a product of its times. Component was early HD and was replaced by HDMI. Capturing the SD or HD signal was entwined with a desire for timer- or EPG-driven schedulers, and multiple digital tuners or CableCARD slots. Dumbing that down to Component input with timer or on-demand recording is one strategy, but the cost and rarity of those connectors is being driven out by the single-cable simplicity of HDMI. Modern display devices more often have HDMI connectors, so that has become the standard.

Up to HD 2K, game capture latency drove the demand for HDMI splitters, which sometimes didn't implement HDCP copy protection, so any HDMI card could be used to capture HD if fed off a splitter. But that wasn't by design, and export controls actively work to find and drive those out of business as quickly as possible. Since HD 4K, it is my understanding that loophole has been plugged.

That leaves Component input on legacy DVD recorders and some PC capture cards. The quality of legacy DVD recorders with Component input was poor and didn't really suit their purpose, SD video capture. It was overkill bandwidth-wise, and the cables cost more, going from three cables for audio+composite to five cables for audio+component. Only LaserDisc could really use the bandwidth, and that was a niche market.

And that brings us to today and the home theater PC market, also vanishing, but more from archival apathy and an overabundance of trust in the cloud, plus a belief that copyright will be offset by lifetime viewer rights. Though, if content owners could selectively erase human memories, I think they would be overjoyed.

IMO, Magewell makes a very nice "tunerless" capture card that works with Windows, Linux and Mac in PCIe and USB form. It has a well-defined built-in full-frame TBC, DNR and Y/C comb filter with proc-amp; full retail is around 300 USD, sometimes 100 USD on eBay. The drivers adopt the most popular API for each platform, so it works with virtually any software. But being "tunerless" it's not exactly on the typical home theater PC enthusiast's radar; it's more targeted at archivists or content collectors.

What made the DVD recorder especially useful in my opinion was the remote, and the simplicity of the task.. collect content, permit limited editing and burn to disc. Compressing and moving all those bits, even by Ethernet, was just too slow, and DVD-RAM never quite supplanted the write-once-and-done DVD-R backup.

Finding that simplicity on a pc is very difficult, unless you walk a fine line and don't try to complicate things.

The single simplest, most familiar interface on the PC for manipulating video content is Windows Media Center; deprecated in 2010, it's increasingly hard to find.. so it is itself becoming "legacy". However, at least on Windows 7, it is still under support until 2020 and somewhat accessible.. Windows Media Center has a partner remote, and can record "live content" from a tunerless input card if it detects an RC6 WME IR blaster. These recordings get cataloged into its library and can be added to a playlist and burned to DVD, and in theory Blu-ray.

That's the theory anyway.. and I'm pursuing it as quickly as I can to confirm.

I really like DVD recorders.. some of the last ones are all Linux based and have a lot of upgrade potential.. upgrading to HDMI or Component input and Blu-ray may be possible, someday, but their post-burner phase has not yet come. For the moment they are too valuable as they are to the people spending a lot of money for them on the secondary market.

A lot of the lessons learned about VCRs with DNR, line TBC vs frame TBC and frame synchronizers, IRE, proc-amps and more are still applicable to an expandable, modular, repairable Component or HDMI recorder.. it doesn't help with the EPG or tuner problems.. but for the archivist little is lost from a skills perspective.

ps. One thing to note about Component vs HDMI recording is that there is no known Copy Protection signal mitigation for false positives readily available. In the past, video filters or something like a Grex could be used to silence the inaccurate signal degradation, whether triggered on purpose, by accident, or as the result of noise.

" it is also beyond my knowledge to even know if the macrovision I, II signals that effect VBI effectively could be blocked because Components R,G,B is digitial and not analog.. however there are other levels of macrovision and CGMS flags as well now.. and Components digitizing chips recognize and honor these".

A popular method might be to use a Component to S-Video converter that then runs through an S-Video Copy Protection mitigator, then back through an S-Video to Component converter.. but this reduces the value proposition of using Component by also causing picture quality degradation.

Also, Component did not have a WSS or Wide Screen Signalling "flag" standard for advising the display device when a signal was being output in an anamorphic format (tall and skinny) that "should" be displayed on a widescreen display at an expanded aspect ratio. While one could be added "later" after capture, or with a specific "in-line" box for this purpose, it was not the norm, and that complicated the use of Component out from sources or Set Top Boxes capable of outputting an anamorphic widescreen signal. In the beginning this wasn't much of an issue, but as DVD content became increasingly anamorphic and some cable channels would switch between 4:3 and 16:9, it has become an annoyance.

HDMI generally avoided most of the WSS problems by properly supporting it, and since the Copy Protection mitigation was the result of an oversight in low-latency 2K splitter devices for game play and game recording.. temporarily at least.. HDMI has some advantages over Component recording.

6/07/2018

Waveform and Vectorscope, Bar signal on a Pedestal


I bought a Leader 5860c Waveform Monitor and 5850c Vectorscope from 1989 last weekend. Setting them up was a challenge.. this is that story.

The Waveform Monitor wasn't as much of a challenge.

Basically it has BNC composite inputs, and I had to get some adapters for my composite cables to convert them over and connect a VCR and a Time Base Corrector to its Input.

The Time Base Corrector could also serve up a 75% Color Bar signal.. which could produce the usual stair steps seen in so many old black and white photographs. That also let me find and recognize the side-by-side field 1 and field 2 "humps", with the full front porch and back porch of each in the center. Along with the IRE (set-up) or Pedestal signal that picks up the black level in North American video signals and "sets it" on a pedestal just above the sync blanking level.

Even though I "sort of had guidance from a PDF manual" it was for the wrong vintage and kind of vague about terms and very short.

I learned I had to DC-restore the signal to keep it from drifting up and down, because by default the signal is AC-coupled to the input, floating about a sync level used to represent the center point of the overall video signal. The monitor had a simple button for DC restore that pinned the signal to its reference point.

I could then move the signal up and down and left and right with some alignment controls, and rotate the "horizontal level" of the scan using a small tool and a trimmer in the upper left hand corner of the monitor's faceplate.

Scaling was automatic (or "Calibrated") or manual (or "Uncalibrated"); when snapped into position, the scale on the graticule represents the signal in terms of IRE units instead of voltages.

That's good, since most literature concerns itself with IRE units and not actual "voltage units".
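
As a concrete illustration, the IRE scale maps onto voltage linearly: a standard NTSC composite signal spans 1 V peak-to-peak over 140 IRE, with the sync tip at -40 IRE, blanking at 0, and reference white at 100. A minimal sketch of the conversion (my own illustration, not anything from the Leader manual):

```python
# Conversion between IRE units and voltage for a standard NTSC
# composite signal: 1 V peak-to-peak spans 140 IRE (sync tip at
# -40 IRE, blanking at 0 IRE, reference white at 100 IRE).

IRE_PER_VOLT = 140.0  # 140 IRE across the full 1 V p-p signal

def ire_to_volts(ire):
    """Voltage relative to the blanking level (0 IRE)."""
    return ire / IRE_PER_VOLT

def volts_to_ire(volts):
    return volts * IRE_PER_VOLT

# North American set-up pedestal: black sits at 7.5 IRE
print(round(ire_to_volts(7.5), 4))   # 0.0536 V above blanking
print(round(ire_to_volts(100), 4))   # 0.7143 V, reference white
print(round(ire_to_volts(-40), 4))   # -0.2857 V, sync tip
```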

I then put a SignVideo proc-amp into the signal path and played with its four controls:

1. Black
2. Contrast
3. Saturation
4. Tint

The first two (Black and Contrast) allowed me to move the "floor" or blackest black level of the video signal relative to the "center point" sync reference level. But that also had a slight effect on the top of the signal, represented by the whitest "white" or brightest signal on the screen.

While the video signal on the monitor represents "Luma" or Brightness irrespective of Color.. each color bar has a declining "brightness" on purpose to create the stair steps. Left to right they fall off in perfect step with the bars on a normal video monitor.. but do not represent any color information.

This is exactly so that the Black control only affects the overall video signal's blackest black.

But after that, adjusting the Contrast raises and lowers the top of the whitest or "brightest" color bar so that it can be set to IRE 100 .. or perhaps lower. IRE 75 is quoted as common, as are IRE 85 and 95 .. as a hedge against signal sources that may "overdrive" or "blow out" the perceived exposure.. losing details in the "wash". This is called "clipping" and is to be avoided.

Clipping can also occur at the other end of the scale, at the blackest black floor.. the goal is to keep tweaking so that most of the signal, most of the time, remains between these extremes.. which can depend upon the exact source used.. but the color bars serve as a first approximation and allow for some sandbagging of the range to protect against "clipping" at either extreme.
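
The "keep the signal between the extremes" idea can be sketched in a few lines: given luma samples measured in IRE units, report what fraction falls below the black floor or above the chosen white ceiling. The 7.5 and 100 IRE limits are the North American conventions discussed above; the sample values are made up for illustration.

```python
# Report how much of a luma signal (in IRE units) clips below the
# black pedestal or above the white ceiling. The ceiling is a
# parameter, matching the "IRE 75 / 85 / 95 as a hedge" idea above.

def clipping_report(samples_ire, black=7.5, white=100.0):
    low = sum(1 for s in samples_ire if s < black)
    high = sum(1 for s in samples_ire if s > white)
    n = len(samples_ire)
    return {"black_clipped": low / n, "white_clipped": high / n}

bars = [7.5, 20, 35, 50, 65, 80, 100]   # idealized stair-step bar levels
print(clipping_report(bars))            # nothing clips
print(clipping_report([5, 50, 105]))    # both ends clip
```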

So the Waveform monitor is for calibrating or setting the "Black and the White" extremes of the signal using a proc-amp. And setting the Black also adjusts the height of the Pedestal for the Blackest black.. which in North America would be IRE 7.5 high (very important for the Vectorscope).

Next was the Vectorscope.

It's similarly easy to connect a video input signal, but it displays its results on a polar or radial graph. Magnitude is the radius from the center, the other coordinate being an angular value measured from a Color Burst reference signal.. not unlike the DC-restore-recovered "center sync" reference for the Waveform monitor..

And like that DC restoration.. the Vectorscope has to "recover" the Color Burst angle and decode the position of all colors from the signal, arrayed in a circular fashion around the graticule or "scale" on the Vectorscope screen.
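
What the vectorscope plots can be modeled as an ordinary rectangular-to-polar conversion: the two demodulated chroma components (called u and v here, roughly B-Y and R-Y) become a radius (saturation) and an angle measured from the color burst reference (hue). A toy sketch with made-up sample values:

```python
# Rectangular-to-polar conversion of a demodulated chroma sample,
# as a vectorscope effectively performs it: radius = saturation,
# angle from the burst reference = hue.

import cmath
import math

def to_polar(u, v):
    """Return (radius, angle_in_degrees) for chroma components u, v."""
    mag, ang = cmath.polar(complex(u, v))
    return mag, math.degrees(ang)

# a hypothetical chroma sample with equal u and v components
radius, angle = to_polar(0.3, 0.3)
print(round(radius, 3), round(angle, 1))  # equal components land at 45 degrees
```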

I made a mistake in setting IRE set-up to 0 for my video signal while using a proc-amp to generate the color bar signals. This caused the Vectorscope to "free wheel" or "spin" like a car driver's steering wheel.. or strobe like the spokes on the wheels of a car. I couldn't get it to stop spinning, even using the phase angle adjustment control repeatedly.

Once I switched IRE 7.5 set-up (on), the bowtie pattern snapped into place and stayed locked.

Also, using a proc-amp as a color bar generator is not ideal.. in tiny fine print, it says you should also connect a video signal to the proc-amp's composite input.. so that a "stable" color burst signal will be included with the color bars generated. This turned out to be true.

While acting as a bar generator the proc-amp cannot be used as a proc-amp; it locks all of its outputs to references.. presumably to act as a "standard" rather than a general purpose (much more expensive) tool.

The radius of each bowtie "spot" represents the relative "color saturation" for that color; as color video has a familiar palette with the bar pattern, each bar creates one spot in the general vicinity of the graticule regions labeled for their color. Angular offset from the color burst phase determines their "color".

So a second proc-amp can manipulate radius by increasing or reducing "Saturation", and this affects the entire constellation and overall "size" of the bowtie.

While the same second proc-amp can manipulate angle by increasing or reducing "Tint", and this "turns" the whole orientation of the bowtie. The optimum goal being to tweak out common imperfections that lead to a "cast" or "overall" color problem that affects all colors equally.
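
The two chroma controls can be modeled as one complex multiplication applied to every demodulated chroma sample: Saturation scales the radius of every point equally, Tint rotates the whole constellation by the same angle. A hedged sketch (my own model of the behavior, not anything from the SignVideo documentation):

```python
# Model the chroma half of a proc-amp as a single complex gain:
# Saturation scales every chroma vector, Tint rotates it.

import cmath
import math

def proc_amp_chroma(u, v, saturation=1.0, tint_degrees=0.0):
    """Apply saturation (radius scale) and tint (rotation) to one sample."""
    gain = saturation * cmath.exp(1j * math.radians(tint_degrees))
    out = complex(u, v) * gain
    return out.real, out.imag

# half saturation plus a 90-degree tint rotation
u, v = proc_amp_chroma(1.0, 0.0, saturation=0.5, tint_degrees=90.0)
print(round(u, 3), round(v, 3))  # rotated onto the other axis, half size
```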

Individual colors which require specific tweaks to Saturation or Tint require the use of a "color generator" or "color corrector", but that is independent of the video signal itself.. it would be a manipulation to grant false "color" enhancement and isn't strictly a result of the signal path or current video signal. It is more akin to using a "paint brush" to touch up a moving picture, as opposed to "fixing" a video signal to be within specifications for broadcast. That is usually more common in film and telecine, to add special effects or to enhance a particular emotion or psychological setting in a scene, than to fix a strictly physical situation.

So I did notice that the arrangement of the proc-amp controls from Left to Right was not arbitrary, as each, from

1. Black
2. Contrast
3. Saturation
4. Tint

progresses from the one a person would notice the "most" if left uncorrected to the one they would notice the "least".

So in this case Black offsets are noticed first, then Contrast problems, followed by Saturation problems and then Tint problems.

----+-- Black (bottom)
Y - | - ----- waveform monitor
----+-- Contrast (top)
      |
----+-- Saturation (radius)
C - | - ----- vectorscope
----+-- Tint (angle)


5/28/2018

Digitizing Analog VHS tapes - to AVI or to DVD


Among "digitizers".. people who want to transfer or convert their VHS tapes to PC files or DVD media there are two camps.

First are the professionals who know every detail about quality and quantity and do it as a business.

Second are the casual users of VHS tapes, who have not used them in years or infrequently and now find they want to perform this quickly and as a means to finally get rid of the tapes.

Among the first category are websites and forums that are mostly going quiet these days, occasionally helping one another to care for the equipment they are using to make these conversions, and answering few questions from "newbies" to the profession or hobbyist who happen to just be starting.

First, it's important to understand that the last VCR was made in 2016 and the tapes are also no longer being made. Every year the tapes get older and degrade, and these forgotten memories move closer to oblivion.

Of the remaining VCRs, most are not well maintained or cared for, and they decay from misuse.. or become damaged from being plugged into fluctuating power lines and lightning strikes. If they don't end up being recycled or tossed in the dump.. they are given away.. and a very few end up on eBay or Amazon or Craigslist as "used".

Among the second category, users generally start out with a combo VHS-to-DVD unit or some USB dongle to perform the "captures" and are sorely disappointed with their results.. they turn to the web, find the "prosumer or professional forums", and discover a new world of choice and information that tends to overwhelm.

There is also almost a "stages of grief" process that sets in from the gradual understanding that what they were attempting has many levels of quality, and generally the professionals tell them they've been doing everything the wrong way.. so they get their standards wound up and upgraded to "pure" and "archival quality".. seeking legendary and near mythical "unobtainium" in the form of VCRs with digital noise filters and line and frame "Time Base Correctors"..

Eventually if they don't quit.. or run out of money and hope.. they discover the easier "MPEG2" path.. a lower bit rate and quality that for some is "good enough" and subscribes to a lower spec than "absolute perfection".

In or around 2003 to 2008 there was a fleeting moment in time when $500 to $1500 DVD recorders were "Staged" to replace the VCR as a means of copying broadcast television to DVD discs.

These could also be used to "capture" the VHS tapes being played back to DVD discs.

Unfortunately with success also comes the realization that a DVD could not hold as much per disc.. so people sought to edit out "commercials" or beginning and end credits for seasons of shows. Doing this with a DVD recorder alone, with no intermediary, was "impossibly difficult".. enter the combo Hard Disk (HDD) and DVD recorder.. which could "capture" even the longest tapes to its internal hard drive and let the user selectively edit and rearrange material, then burn "title lists" to a single disc, or break up the list and burn groupings of "title lists" to sequences of DVDs one after the other.

Great in theory and practice with a little experience.

But then all of the major makers of DVD recorders and HDD/DVD recorders disappeared one day.. and the remaining recorders aged and the DVD burners began to wear out.

So people then looked towards settling for capturing DVD quality to PC files.. but the capture equipment usually (with a few exceptions) would not allow copying the large MPEG2 files used to create DVD discs to a PC.

Which then brings us to the Home Theater PC.. a complicated mix of presentation and workstation editing capability. Generally these are not designed with editing and archiving in mind, and support is near non-existent. The standards, unlike DVD or MPEG2-for-DVD, rove all over the file type landscape and confuse to no end.. mastering or "authoring" a DVD from files captured to an HTPC is a soul-crushing exercise.

A single maker of HDD/DVD recorders, Magnavox, lasted until 2017 and then mysteriously did not deliver a set of three new recorders in the last half of that year.. stranding many archivists with no way to finish their conversions.. or soldier on.

5/27/2018

fit-PC2i Atom 510 - Centos 6.9 i386

The fit-PC2i was a low power dedicated server module from Israel with many customizable options, mostly intended for do-it-yourself custom firewalls running a version of 32-bit Linux or Windows 7.

It comes from around the years 2008-2010.

Many linux distros no longer support something so low power, or exotic.

However Centos 6.9 i386 will install on this device.

The IODD portable combo USB CD/DVD-ROM drive emulator and simultaneous USB hard drive is a great way to boot quickly and switch between many ISO images. A special directory on the IODD is labeled ( _iso ), and .ISO images are placed in this directory.. a combo jog wheel and selection button on the side of the drive case allows scrolling between the images inside this directory and "mounting" them.. the selection is saved to a Fujitsu based microcontroller in the drive case, and immediately this is presented as a USB attached CD/DVD-ROM drive with the selected image mounted as if it were an optical disc.. no burning, no "actual" optical media needed.

Upon reboot the last selected disc image is automatically presented as a bootable device option. The drive case also simultaneously appears as a separate USB hard drive, which is very convenient for offloading or onboarding files to and from an operating system that can mount the virtually attached optical drive and the virtually attached USB hard drive.

The fit-PC2i has a half-height microSD slot for flash media, four USB ports (two Type A and two microUSB), and a drive slot for a SATA drive.

Unfortunately the support site recommended Ubuntu Desktop 8.04 as a boot option.. but this had a "bug": if a SATA drive were installed, it would not be seen by the boot installer kernel.. frustrating to say the least. The Centos 6.9 kernel correctly "sees" the SATA drive and, using advanced options during partitioning, even allows checking off drives to use, or unchecking drives to ignore, when installing the Linux operating system on the device hard drive.

This is a metal-cased, passively cooled device.. so it generally gets "hot", and a USB powered external fan like the "AC Infinity" lineup with inline speed control and rubber shock absorbers makes a very low cost and effective cooling solution.. and is very quiet.

The dual LAN ports make this Ice Cream sandwich sized server a fairly flexible platform.

Toshiba xs54, xs55 - Net Dub (copy) to PC

A random websearch turned up a 2008 blog article in Japanese regarding the xs37 (a model sold only in Japan) with a "Navi from Net" feature called Net Dub.

Net Dubbing is a term for Network Copying or "Duplicating.. hence Doubling.. or Dubbing" a recording to another xs37 or other recorder.

As a hand-held-remote driven "workstation" for mastering and creating DVD recordings.. shuttling recordings between workstations before editing was taken into account as a desirable feature. The recordings are "not" transcoded but remain at the same resolution as when they were originally recorded.

PCs don't normally participate in Net Dubbing.. however they can, with a simple protocol daemon that listens for a NetBIOS broadcast requesting that XS recorders identify themselves with their anonymous FTP server paths.

A simple systray windows application was created and released as Freeware. I modified the text labels for English and the result was a Virtual RD-XS recorder service for the PC.

Starting this allows you to set a download path for recordings "pushed" to the PC from the Net Dub interface on the XS recorder. The title or name of a recording on the XS recorder is used as its destination filename on the PC. An extra .txt file is created with any metadata associated with the recording.
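
The discovery half of such a daemon can be sketched in a few lines. This is reconstructed only from the description above: wait for the recorder's broadcast on the LAN, and answer with a string identifying this PC's anonymous FTP path so it shows up as a Net Dub target. The port number and reply format below are placeholders, not the real protocol.

```python
# Minimal discovery responder in the spirit of the Virtual RD daemon:
# listen for one discovery datagram and answer it. DISCOVERY_PORT and
# REPLY are hypothetical stand-ins for the actual Net Dub protocol.

import socket

DISCOVERY_PORT = 1048                               # hypothetical port
REPLY = b"VIRTUAL-RD anonymous-ftp /recordings"     # hypothetical reply

def serve_once():
    """Answer a single discovery datagram, then return (data, sender)."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", DISCOVERY_PORT))                 # listen for broadcasts
    try:
        data, addr = sock.recvfrom(1024)            # the recorder's request
        sock.sendto(REPLY, addr)                    # identify ourselves
        return data, addr
    finally:
        sock.close()
```

A real daemon would loop on this and then accept the FTP "push" that follows; the sketch only covers the identify-yourself handshake described above.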

4/28/2018

Toshiba HDD/DVD Recorders

After some research and purchases on eBay, I now have one of all the major RD-XS hard disk equipped DVD recorders.

The flirtation with replacing VHS tape based recorders with DVD optical disc based recorders had problems. The optical media usually consisted of a choice between cheap write-once media or more expensive read/write media. As with tape based recordings, commercials and other material were recorded at the same time; however, where tapes could record 2 to 8 hours per cassette, the quality lost when recording non-standard low bit rates on optical disc was much worse. So it became even [more] desirable to perform some type of editing before burning the recording to disc.

Enter the hard disk drive, familiar from personal computers. Some DVD recorders included an 80 GB to 500 GB hard disk drive, which was nominally used as the default for capturing the off-air broadcast. Various electronic program guides or timer based recordings could be used to automatically select programs or series of programs from a season of one or more shows and store them on the hard drive. From the hard drive one could then watch the programs and recycle the hard disk space without burning to optical disc, or [edit] out commercials and beginning and ending titles to save space, using the saved space for more episodes or to keep the bit rate higher for a better quality recording to disc.

This was the TiVo concept evolved from a DVD recorder into something like a Personal Video Recorder, without the normal monthly or yearly subscription to the program guide made popular by the TiVo business model.

At the turn of the Video era when NTSC analog signals were stopped and replaced by ATSC over the air, the requirement for a new ATSC tuner drove the price of the combined HDD/DVD recorder so high that many companies exited the market.

Until that time, however, there were a few companies offering better and better off-air recording to hard disk drive, which also digitized or encoded the analog content into MPEG file format.

Among these were the Toshiba branded "RD" for "RD Life" series of "XS" HDD/DVD recorders.

Warning: These recorders are very dependent on their remote controls; the front of the consoles does Not have a complete set of control buttons. These recorders are not usable without their original OEM remotes. Programmable and universal remotes are inadequate as replacements, because the complex documentation refers to the OEM remotes throughout.. it is not possible for a user to perform the complex mental translations necessary to use a universal or programmable remote with the OEM documentation in any reasonable fashion. Do not try it. Further, each model has a unique OEM remote model.. they are not usually compatible between generations or step-up models. If matching a second hand OEM remote to a unit without its original OEM remote, be very careful to note the OEM remote model.. they absolutely must match.

Simply.. do not buy a Toshiba RD-XS without its "original" OEM remote.. and make sure it is included as part of the terms of sale.. or return it.. it's not worth the trouble.

It is [Very] common to find the remote [Not] included as a term of sale, or substituted with a generic.. or the terms will say "as..is" and "no returns accepted".. the remotes sell for quite a bit separately from the recorders and are often prized [above] the actual recorders themselves, because the recorders are useless without them.

Basically, Toshiba made many, many different models for Japan and markets outside the US and North America; only a few were brought to the Canadian and US markets, and not necessarily the same models.

They arrived in three waves:

XS-32, XS-52
XS-34, XS-54
XS-35, XS-55

The second digit represents the "generation" of the recorder.

The first digit represents the "feature" level of the recorder.. also called a "Step-Up" level.

Within each generation the same DVD burner was used, all used an ATAPI packet based command language to burn discs.

The x5 "step-up" models included "progressive upscaling HDMI output" for playback only.

The XS-32 and XS-52 were known to have a problem with their handling of the IRE set-up or "Black Level" definition in the US market, resulting in DVD discs burned on those recorders looking correct when played back only on those recorders, but otherwise appearing washed out: black level elevated to "grey level" and white level blown out, resulting in a loss of contrast or dynamic range.

While this could be corrected in software, the loss of dynamic range could not, without preconditioning the input signal to the recorder.. a device to effect this change was never manufactured.. and a software fix accepting the inevitable loss of dynamic range in exchange for normal playback of discs on all recorders was never made available. Only DVD burner drive firmware updates were ever offered to consumers via the website, and later only by firmware discs available from the manufacturer through the mail.. after the firmware was removed from the website.

Several "revisions" of the motherboard and motherboard firmware for the XS-32 and XS-52 were observed in the "wild" by consumers, but no means of deploying "updated" motherboard firmware was ever found.. the newer firmware judging by the firmware versions between motherboard "revisions" was "as-is" from the factory and considered immutable.

These were considered quite advanced "workstations", approaching the flexibility one could have mastering or authoring disc creation on a personal computer with specialized software, and they had a great reputation worldwide.

While external proc-amps could be used in the US to attempt to correct the input capture problem with Black Levels.. quite a bit of tweaking was necessary, since it also involved stretching the video signal over the dynamic range while avoiding clipping of blacks and whites, and compensating for the loss of chroma gain and skewing of the tint.. at best it was a complicated bargain.
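
The luma half of that correction is essentially a linear remap: pin the elevated black back down to the pedestal and stretch what remains up toward reference white, clipping at the legal limits. A simplified sketch in IRE units; the incoming black level of 15 IRE is an assumed example, not a measured value for these recorders.

```python
# Linear luma remap: move an elevated black level (black_in, an
# assumed example value) back to the 7.5 IRE pedestal and stretch
# the rest toward reference white, clipping to the legal range.

def remap_luma(ire, black_in=15.0, white_in=100.0,
               black_out=7.5, white_out=100.0):
    scale = (white_out - black_out) / (white_in - black_in)
    out = (ire - black_in) * scale + black_out
    return max(black_out, min(white_out, out))  # clip to legal range

print(remap_luma(15.0))   # elevated black lands back on the 7.5 pedestal
print(remap_luma(100.0))  # ~100, reference white is preserved
```

The "complicated bargain" shows up in the math: any scale factor above 1 risks pushing highlights into the clip, which is why the chroma gain and tint also needed compensation.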

The XS-34 and XS-54 would see firmware updates "specifically" to compensate for the Black Level problems of the previous generation in the US and Canada, and while having a lackluster physical case appearance, they are considered the most desirable by collectors. They had very durable Panasonic style DVD burners with long life when serviced to remove dirt and grime on a regular basis, and were on the whole quite economical. In some ways they were considered the apex of the product line. However the UK version XS-34SB and the European XS-34SG (were Not) capable of NTSC capture.

This was quite different from Panasonic and Pioneer recorders which could decode and capture NTSC signals in their UK/European world models. The XS-34 UK and European models often appear somewhat similar to the XS-34 US model but are (Not) desirable in the US or Canadian markets even if the tuner is of no issue.

In particular the UK model does not appear to have SCART connectors on the back and can be mistaken for a US model.. since the XS-34 (US model) can be quite rare and hard to find, it is a common mistake to acquire an XS-34SB, or even an XS-34SG model (which does have SCART connectors on the back), thinking it "might" record NTSC signals.. it will not.. and further it cannot output an NTSC playback signal either.

The XS-35 and XS-55 were the last of the Toshiba HDD/DVD recorders imported into the United States, and the XS-55 was not imported into the Canadian market. While better in appearance, they were still somewhat lackluster. They did contain many features that would never appear in any other HDD/DVD recorder. The XS-55 continued to support "Net Dubbing", even between it and the previous generation: networked on a LAN, they could copy recordings between machines without first burning to disc.

2/08/2018

Retro-fitting DVRs with a usb port

DVRs began as a way of digitizing analog video signals from aerial broadcasts; they evolved to digitize VHS signals from tapes and personal camcorders to optical disc media. Because commercial movies were released on the same optical and aerial mediums, rights owners weighed in and impressed upon the designs varying methods of protecting copyright.

Consumer video products have long since moved on from Standard Definition (SD), but the older analog signals captured on personal tape based recorder products remain. Setting aside the rights management issues, which made digitization more difficult even as they withered away.. the lack of a method to even extract the MPEG2 stream from a video destined for an optical DVD burn has consigned many DVRs to landfills or abandonment.

Many brands of video recorder have at one time or another used commodity optical disc "DVD-R" burners, which almost universally rely upon the ATA Packet Interface (ATAPI) to conduct a recording session over an IDE (PATA) or SATA bus. These are not new designs, and they are well documented. The signaling cables are standardized.. and although there was flirtation with removing the microcontroller unit managing the IDE bus from the drive motherboard and placing it closer to the DVR main motherboard.. integrating it or placing it on a daughter card.. in later years.. often the signal paths remained accessible down close to the mainboard.

That means with exceptions.. many designs had a common internal IDE signal bus, with a max speed of 25 MHz for UltraDMA100 and often ran much slower.

In fact the CD and DVD xSpeed standards would often run only at the speed negotiated for a particular DVD-R burner drive, and for the most part remained x8 or lower, for stability and due to the speeds available to cost constrained microprocessor equipment up to about 2006.. although the equipment might run into the $100s or $1000s of dollars.. the tech was simply much slower than today.

Enter the 8051 and CY8C5 generation of dedicated real-time microprocessors, driven by the phone industry and other evolutionary pressures. They are much cheaper and faster than the cost constrained microprocessors in the 2006-era DVRs. It's possible a modern MCU could be used to emulate a device on the existing IDE bus by "learning" the signal conversation, extract the MPEG2 stream destined for the optical media as a stream of ATA packet commands transferring data, then direct that over a USB 2.0 bus to an external computer, iSCSI device, USB drive or a USB DVD-R burner of a modern design from a third party.

The small size and near complete SoC implementation on prototype boards from Cypress Semiconductor, for $10 in single quantities, makes it almost an exercise in software only.. with a few custom cabling requirements.. and choices over wireless or some type of re-housed external port exposure through a faceplate.

Re-implementing a near 30 year old IDE bus in a real-time MCU using C code is no small task, but it doesn't seem insurmountable, given that the bus has been thoroughly documented.. and the ATAPI interface is on the whole based on SCSI, with a relatively small command set of about 40 words.
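
The "learn the conversation" idea is tractable because ATAPI commands are fixed 12-byte SCSI-style packets. A sketch of the decoding side (in Python here for illustration, though the MCU work would be C): the WRITE(10) command, opcode 0x2A in the SCSI MMC command set, carries the LBA and sector count of data headed for the disc, which is exactly the MPEG2 payload an interposer would want to divert to USB instead.

```python
# Decode a 12-byte ATAPI command packet. For WRITE(10) (opcode 0x2A)
# the CDB layout is: byte 0 opcode, bytes 2-5 big-endian LBA,
# bytes 7-8 big-endian transfer length in blocks.

import struct

WRITE_10 = 0x2A

def decode_packet(pkt: bytes):
    """Return (opcode, lba, blocks) for WRITE(10), else (opcode, None, None)."""
    opcode = pkt[0]
    if opcode == WRITE_10:
        lba = struct.unpack(">I", pkt[2:6])[0]     # big-endian 32-bit LBA
        blocks = struct.unpack(">H", pkt[7:9])[0]  # big-endian 16-bit count
        return opcode, lba, blocks
    return opcode, None, None

# a WRITE(10) for 32 blocks starting at LBA 4096
pkt = bytes([0x2A, 0, 0, 0, 0x10, 0x00, 0, 0x00, 0x20, 0, 0, 0])
print(decode_packet(pkt))  # (42, 4096, 32)
```

An interposer would watch for these packets on the bus, copy the data phase that follows each one out over USB, and acknowledge the drive-side protocol so the recorder believes the burn succeeded.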

The benefits would be the continued usefulness of these aging devices for their original purpose and some possible retention of our media history.


1/25/2018

Y/C Combs, DNR and Twin Perfect

A television video signal is a combination of Luminance (Y) and Chroma (C) information in one signal. A Composite (or combined) version of these two signals on a single set of wires is called a video signal; it includes (no) audio or sound information.

Normally the separation between the Y and C components of the video signal is distinct enough to recreate the video without error.. however the signal degrades over long wires as the Chroma information "smears" into the Luminance information, affecting picture quality.

To preserve the Y and C components over long wires or poor quality cables, it is better to keep the two separate on two distinct signal wire pairs. The S-Video standard was created to do this, and includes four total wires forming two wire pairs. One pair carries the Y signal; one pair carries the C signal.

S-Video is a "wiring standard" and really has nothing to do with the "S" in S-VHS.

The "S" in S-VHS stood for "Super" and indicated more horizontal dot resolution or perceived Television Vertical Line (counts) also known as "TVL" .

Strictly speaking.. a Black & White picture of only Luminance information could be S-VHS and would have zero benefit over an S-Video cable.. they are entirely two different things.

S-VHS is about horizontal (across the scanline) dot resolution

S-Video is about (preserving) accurate color information that might otherwise be lost, destroying perceived horizontal dot resolution in the process of carrying the signals the short distance from the VHS player to the Television.

People often confuse or conflate the actual meaning of the two by saying one may affect the other.. in the final result.. the picture.. which is true.. but for different physical reasons that only (sound) like they are related.. in reality they are unrelated.

A Comb filter is used to "extract" the Y from the C information in a "Composite" video signal.

The video signal is (stored) on the Tape in a "Composite" signal format.

All S-Video VHS players have a Comb filter.


When the Tape is played back the Composite signal extracted from the Tape can be handled in one of two ways.

The Composite signal can be placed on a single wire pair and output to the Television over a Composite connector, or the Composite signal can be [broken down using a Comb filter] into separate Y and C signals, put individually on separate wire pairs, and output over an S-Video connector.
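The arithmetic behind a simple two-line comb filter can be sketched in a few lines of Python. This is an idealized toy, not a real decoder: it relies on the NTSC property that the chroma subcarrier phase inverts on each successive scanline, and all the array and function names are invented for the example.

```python
import numpy as np

# Minimal sketch of a 2-line (1H delay) comb filter, assuming an idealized
# composite signal whose chroma subcarrier flips phase 180 degrees per scanline.
def comb_filter(line_a: np.ndarray, line_b: np.ndarray):
    luma = (line_a + line_b) / 2.0    # in-phase luma adds, chroma cancels
    chroma = (line_a - line_b) / 2.0  # luma cancels, chroma remains
    return luma, chroma

# Toy demonstration: constant luma plus a subcarrier that flips sign per line.
t = np.arange(8)
subcarrier = np.cos(np.pi * t)        # stand-in for the color subcarrier
line_a = 0.5 + 0.2 * subcarrier       # scanline n
line_b = 0.5 - 0.2 * subcarrier       # scanline n+1 (chroma phase inverted)
y, c = comb_filter(line_a, line_b)    # y is flat 0.5, c is the subcarrier
```

Averaging adjacent lines recovers the luma; subtracting them recovers the chroma. Real comb filters (2D and 3D) are far more elaborate, but this is the core idea.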

The Composite connector will provide a better signal to a Television than an RF connection. The Television will see fewer signal losses and produce a better picture.

The S-Video connector will provide a similar, but even better picture because there will be less Chroma crosstalk with the Luminance signals over the length of the connector from the VHS player to the Television.

Comb filters also offer the "opportunity" to improve the signal quality with filters and amplifiers tuned and customized to work on the Luminance (Y) and Chroma (C) signals separately, which exist at different frequencies and are vulnerable to degradation in different ways. Analog and Digital "noise reduction" can be used to "process" the video signal recovered from the tape before it is output to the Television.. this is called a [ processing amplifier ] task, performed by a [proc-amp] for short.

Digital Noise Reduction (DNR) is both cheaper and considered more precise than analog noise reduction, which can have non-linear characteristics (which are difficult to describe and teach how to use). Non-linear noise reduction is also harder to repair or reproduce in similar or duplicate circuits.

Not all VHS players have NR or DNR circuits, as it's considered a more expensive and premium feature.

Another way to improve VHS playback is to tune the tracking and control circuits based on the type of tape inserted into the system.

Mitsubishi pioneered a technique called "Perfect Tape" or "Twin Perfect".

Originally intended for "preparing" the VHS recorder function, it "samples" any tape inserted into the VHS recorder/player which does not have its write-protect tab broken off.. in anticipation that it may be used for making a new recording. By doing this it can configure or optimize various circuits in the VHS player to make the best use of the Tape provided and make the strongest recording possible. On playback it is similarly optimized to extract the best signal possible.

The on screen display for a Mitsubishi VHS player with this feature can be used to manually engage or disengage this feature on demand regardless of the state of the write-protect tab.

Mitsubishi VHS players also have a direct drive method of fast forward or fast rewind called "Turbo Drive" which was used to reduce the time a consumer was required to wait for a Tape to be wound or rewound for return to a rental store.









VHS drum heads, why so many

A VHS player was originally built around a rotating cylindrical drum spinning on an axis at a canted angle to a video tape pulled through a tape path. It had a separate erase head before the drum, and a separate audio head and speed control head after the drum.

The erase head simply "cleaned" the tape of any magnetic patterns before the tape reached the drum heads, which laid down new video information in angled or tilted tracks across the width of the tape. Each angled track held one field of the Television picture (half of an interlaced frame), not a single scanline.

After the video tracks were recorded, the audio head recorded a linear track along one edge of the tape for its sound information, and the Control head recorded a linear track along the opposite edge for its tracking information.

On tape this meant the very Top edge of the tape had audio information, and the very Bottom edge had control tracking information.

Neither linear track affected the picture much, because the angled video tracks stopped short of the tape edges where the audio and control tracks lived.

The audio track was mono (not stereo) in the first VHS standard. Dual tracks and HiFi were two different standards that would come much later. And by most relatable audio standards it was quite low in frequency bandwidth.

The control track was a timing signal which, when played back, acted as a feedback signal to the tape path drive motors to servo-regulate the speed of the tape as it moved through the system, so that scanlines and signal arrived at the playback Television at the correct rate in order to regenerate the video signal. If the signal was too slow, drive electronics sped the tape up; if too fast, drive electronics slowed it down.

Frame rate tracking, indexing or accuracy were never part of this Control Track mechanism.. certain specific manufacturers replaced this track with their own variation to encode extra information and provide either their own version of a frame or location tracking system.. or re-implemented the Society of Motion Picture and Television Engineers (SMPTE) frame accurate time code on the control track.. but this was rare and non-VHS standard.

Basically the VHS "standard" was feature-less and made to be cheap and easy to implement across many manufacturers and vendors. Broadcast quality producer level features were reserved for much more expensive and purpose built equipment.. or, non-VHS (non-Home) video equipment.

DV (for Digital Video) would later re-think and cross pollinate ideas from Broadcast features and Home video features to create a new "incompatible" video standard that would be low cost enough to be accessible to the Home video market.. mostly by way of the "Camcorder".. but it was not VHS... even though it did borrow some of its ideas to achieve its goals.

The confusion often led consumers to assume that DV was an upgrade or improved version of VHS, when actually it offered video with a different set of goals and compromises. In some ways better, others worse and definitely not media compatible.

Originally there were two video heads on the spinning drum. The M shaped tape path wrapping the tape around the drum allowed one head to record one complete field to the tape, angled from top to bottom along the length of its travel path. The next head would begin its traversal as the first head left the tape path and rotated out of contact around the backside of the drum.

So the minimum number of VHS player video heads was "two".

VHS had a specific tape speed, meaning the size of the video heads was fixed to optimize the size of the magnetic track based on this speed. The initial speed was called "SP".

When longer length video recordings were made possible by changing the tape "speed" the size of the video heads had to be changed to make the width of the magnetic tracks smaller.. so (two) additional heads were added for "LP" (Long Play).

And then additional heads might be required for "EP" (Extended Play).

The LP tape speed dropped out of favor and the choice became SP or EP. Even when SLP was advertised, SLP and EP were the same tape speed (two names for the slowest speed).. instead of changing tape speed, the length of tape per cassette was increased for longer recordings.. so in the end only two sets of head sizes were normally included, developing into the (4-Head VCR as a consumer staple).

A "Flying" Erase head was also added to the drum so that an "Editing" Cut or Insert could be made closer to the actual point at which a video was stopped or "frozen" during playback, before engaging recording from a second VCR. This was a "Prosumer" feature rarely used by most people.. but it made near real time "Linear" editing possible during playback on a VCR used for both playback and recording from other decks.

Previously the Erase head was not on the drum, and was offset far enough that the point where a frozen frame met newly recorded video might overlap, or include "magnetic" bleed-through of signal from a previous recording, or random noise on the tape with a pattern.. leading to chroma aberrations at the insert point. By moving the erase head onto the drum, closer to the actual recording heads, this problem could be minimized.

So that added a (5th) possible head to the VCR (not counting the original Erase and Audio and Control heads that were "not" on the drum)

Finally "official" stereo was added to VHS, by [deep] recording an opposite angled, slanted set of audio tracks at a different frequency and magnetic strength than the video tracks. This minimized crosstalk between the signals, and a bandpass filter could be used to further reduce the perceived "noise" in the video signal from the audio signal in the central portion of the tape normally used only for video signal.

Although this made stereo possible, at near CD quality.. it also introduced a perceived "buzzing" or possible interference when electrically switching from one head to the opposing audio head on the drum. Further bandpass filters were used to attempt to reduce the "noise".. but circuitry degradation over time meant the buzzing could increase over the years with older equipment. A technique some people used was to switch off the stereo track and fall back on the (mono only) track recorded for backwards compatibility (unless specifically used for different content like alternative languages, or narration, it was a duplicate of the stereo track) at the edge of the tape.. which would not have any switching-buzz noise.

The "mono audio track" is also sometimes called the "Linear audio track".. choosing between them is a good thing, [mixing] them is usually a bad thing (if they are backwards-compatible duplicates of the same sound track), primarily because the two are physically located at different points along the tape path.. any imperfection (which is very common) will introduce a slight difference or signal delay.. that manifests itself as a (tunnel) echo effect in the audio when both tracks are played in [mixed mode].. over time, on older equipment or older tapes, this effect increases.

[Mixing] the stereo and mono tracks did have a purpose however, if they contained different content.. for example, orchestral music content recorded on the stereo tracks, and speaker, dialogue or narration content recorded on the linear mono track. In this way it was used as a simple audio "mixer" setup, and allowed for post-production with a single VCR, often called ADR - "Automated Dialog Replacement" in the film industry.. it's also known as a "looping" or "loop session" recording to improve the sound quality of dialogue. Today however this can be accomplished with much greater ease in computer software mixers for working with sound and video.

[Mixing] stereo and mono tracks that contain the same content, however, is not recommended.

So adding two more drum heads brought the total on the drum up to (4 + 2 + 1 = 7) for the video, audio and flying erase head.. and if LP is actually supported (6 + 2 + 1 = 9 heads) .

In the end 4 + 2 +1 was more normal and advertised as 4-head plus HiFi audio plus a "Flying Erase head".. if the product was a high-end model intended for limited Insert Linear video editing between two or more VCRs.. also called "decks"




1/07/2018

Capturing VHS to PC, before its gone

I've been busy exploring (or re-exploring) the process of converting VHS tapes to PC files before the equipment is totally gone, or the tapes disintegrate. JVC and Funai have stopped producing VHS players and eBay is starting to run out of even used VHS machines.


I started with a simple survey of the methods and tried to pick a simple path of USB dongle to simple capture software, like Virtual Dub to computer file. But two things occurred.

1. I didn't realize how important a good VHS player was and the initial results were bad
2. I didn't know nearly enough about VHS video signals to make reasonable decisions about equipment or software

So I turned to some online forums like VideoHelp, AVSForum and DigitalFAQ

Time and again I reached a point of decision, only to collapse when I posted a summary of my efforts and learned there was still much to learn.

So a quick knee-jerk decision to pick up the project turned into months of reading and correlating, and embarrassment online from people who knew better than me, because they were retired and had been in the broadcast business for many years.

Meanwhile a clock is ticking.. not only on the tapes, as they are getting older, but on the hardware and its availability.

VHS playback is a very complex thing.

To understand a good VHS player versus a marginal or bad one, you have to start with an ideal assumption of the source tape. Then imagine all the things that could go wrong, and could be handled by the choice of VHS player, or anything you insert between the player and your capture device.

You don't normally have access to a perfect tape, or perfect capture device.. and what could go wrong can only be speculated about, or told to you by more experienced people.

So starting with the near worst case, a broadcast over the air signal captured by a TV tuner, and then put on a tape.

I learned the VHS system was invented by JVC (Victor Company of Japan) and the first VHS recorder/player was the HR-3300 released in the mid 1970's.

Three major companies in Japan were working on a Home market video player: Sony, JVC and Matsushita (aka Panasonic). Sony wanted to use the "C" method of tape lacing around a helical drum and faster tape, which only recorded one hour of video. Sony offered their system to JVC and Panasonic, who turned it down in favor of "M" tape lacing and slower tape speed to fit two hours on a tape, thinking Home users would prefer to save "Movies" on the slightly larger tapes. Most of that was academic and both turned out to be right in different market segments.. but somewhat like Compaq vs IBM years later.. the lower cost option for the consumer and greater "choice" or "confusion" led to VHS becoming the most popular.

So VHS stands for "Video Home System" and JVC marketed three versions of their recorders:

HR - Home Recorders
SR - Service Recorders
BR - Broadcast Recorders

There were others targeted at particular industries, but these were/are the most accessible to people today.. though they are becoming scarce.

Video signal is a strange kludge of "encoding" and "compression" through hardware circuitry rather than digital processing. Somewhat like the typewriter, it had to slow things down.. because the equipment of the day was much slower than today.. so it had very limited bandwidth in which to transmit even a luminance signal... bright and dark spots on a screen.

A Television signal is basically two overlapping pictures called fields, transmitted one after the other.. they are taken at two different times, so there is a slight "gap" in between them in which motion is missed, and a difference can arise between the two pictures if they are displayed at the same time.

A normal Television is designed to never show both pictures at the same time. The human eye's persistence of vision, while one picture is "fading" and the next one is being drawn, leads to a phenomenon where the brain "interpolates" or automatically fills in the visual gaps. Because two pictures are being shown, but not at the same time, a full frame with all of the vertical resolution is called [Interlaced].. literally time-space "woven" from the (odd) and (even) lines of either picture.

So two fields make up a frame of video, and then the next frame is constructed by showing two more "Interlaced" fields.. doing this means the signal for a full field only needs one-half the bandwidth that would be needed if both pictures were interwoven together and transmitted at the same time.

It also means that although full frames only arrive at 30 frames per second, the motion appears to occur at 60 updates per second.. motion resolution is preserved, even though frame rate is not.

This is important to know, because a computer screen, and modern LCD TVs, display in what is called "Progressive" mode.. one frame at a time, no fields.. when that happens the difference between the two [woven] fields in one frame becomes noticeable to the human eye.. a Progressive display renders the video "de-interlaced", which looks bad most of the time.

A person will notice it [more] when there is faster action, or more "difference" between the two field pictures that get woven into the one "progressive" frame.. these look like Herringbone or "Mice teeth" or zig-zag "lines" around moving things in a video scene.
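The field weaving described above is easy to picture with a small numpy sketch. The function names are invented for illustration, and the "motion" is a crude stand-in for real scene movement between field captures:

```python
import numpy as np

# Split an interlaced frame into its two fields, and weave them back together.
def split_fields(frame: np.ndarray):
    return frame[0::2], frame[1::2]   # even (top) field, odd (bottom) field

def weave(top: np.ndarray, bottom: np.ndarray) -> np.ndarray:
    frame = np.empty((top.shape[0] + bottom.shape[0],) + top.shape[1:], top.dtype)
    frame[0::2], frame[1::2] = top, bottom
    return frame

# A static scene weaves back together perfectly...
static = np.arange(8 * 4).reshape(8, 4)
top, bottom = split_fields(static)
rebuilt = weave(top, bottom)

# ...but if the scene moved between the two field captures, adjacent lines
# disagree, which is exactly the "mice teeth" combing seen on progressive displays.
moved_bottom = bottom + 10    # crude stand-in for motion between fields
combed = weave(top, moved_bottom)
```

When the two fields match, the woven frame is identical to the original; when they differ, every other line carries the "newer" picture, producing the zig-zag edges around motion.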

There are different ways of "compensating" during capture, or after capture during conversion to a compressed file format, to make an Interlaced video look "better" when displayed on a Progressive display.. but these methods keep changing from year to year.. and previous methods rapidly come to be regarded as "poor" compared to the new ones.

It's now considered [bad] to even attempt to "squash" or "de-interlace" video that started out "interlaced" when it is captured. Better to store it interlaced and let the software displaying it in the future use modern methods.. or if the display device is capable of displaying interlaced video as interlaced video.. give it that opportunity. -- the old reasons for "de-interlacing" during or shortly after capture have gone away.. de-interlacing "on the fly" during playback was once considered slow and CPU intensive.. and of poor quality on low powered devices like cell phones.. most now have specialized hardware for doing that, and CPU power has increased exponentially.. so it is no longer an issue.

It's also important to know that capturing "Interlaced" video as "Interlaced" depends on a certain "minimal" vertical resolution in the capture device. You can often configure a capture device to capture at 320x240 or 640x480 and so forth. The second number is how many vertically "stacked" horizontal lines to capture. For VHS signal the vertical resolution is set by the NTSC signal standard at 525 total lines, of which about 480 are visible (it varies because of the vertical blanking interval above and below the scanlines, which may be hidden or not shown on a TV with a bezel). Capturing at anything less than that and the resulting video file will not have enough information to recreate an "Interlaced" field effect which an "Interlaced" playback device can use. It would effectively have to "mush" it all together and treat it as Progressive, with all the attendant problems that would cause.
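The rule in the paragraph above reduces to a one-line check. A sketch in Python, using the ~480 visible NTSC lines figure from this document (the constant and function name are invented):

```python
# To preserve the interlaced field structure of an NTSC source, the capture
# height must be at least the ~480 visible scanlines.
NTSC_VISIBLE_LINES = 480

def preserves_fields(capture_height: int) -> bool:
    return capture_height >= NTSC_VISIBLE_LINES

# 640x480 keeps both fields intact; 320x240 cannot, so the result is
# effectively progressive and the interlaced detail is lost.
ok = preserves_fields(480)       # True
too_small = preserves_fields(240)  # False
```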

Video signal also has a very confusing standard for measuring [horizontal "dot" resolution].

Called TVL - "television vertical lines" - it sounds in English much like a reference to "the vertical axis" in a typical mathematical X-Y coordinate system.. but it deliberately does not mean what it sounds like it means.

Rather, TVL "vertical lines" is the answer to a question about a video signal observed on a monitor or display device. ["That is"] it answers the Question: how many Vertical Lines can you see horizontally in a video image, along a horizontal distance equal to the vertical height of the image? -- It is [Not] "count the total number of dots you can see across the entire width of a horizontal video line".. but (only) the total number of dots representing the "tip top" of vertical lines running from the top scanline to the bottom scanline, across a horizontal distance defined by the vertical height of the video. The vertical height of the video is a known quantity, a fixed number of stacked scanlines; it does not vary even if you can't see them all.

I think I know why they picked this seemingly "weird" definition, but it doesn't help new people, who have no prior experience, to understand it.

First, the horizontal line is not always the same length on every monitor. A display device in the years when CRT monitors and TVs were used had a bezel.. and that "hidden" portion of each horizontal line was [variable]. So they "intentionally" defined the "test" or "Question" to end somewhere within the center of the line, and not necessarily start from the Left side, like on a number line.. because the Left side could be hidden under a bezel as well. -- so much for the history lesson.

But worse, after they got this answer, they intentionally continued to refer to this answer as the "vertical resolution" because the test involved counting striped lines that ran "vertically" even though they were being used to measure "horizontal dot" resolution.

In the digital world of progressive displays, things "line up" more like the X-Y number line from mathematics: a 640x480 image is 640 pixels across, and 480 pixels up or down.

In the video world of interlaced displays, things get called weird names. First it's assumed you know the frame height is the [sum] of two fields, totaling 525 lines stacked one on top of the other vertically (about 480 of them visible), with some lost to the top and bottom "bezel" or "blanking interval". But then they refer to the "vertical resolution" as "lines".. which are vertical lines counted "horizontally", giving you horizontal "dot" resolution.

The end result for a VHS video image, is there are about 480 vertically stacked horizontal lines, and about 240 horizontally aligned (like dominoes) dots on each single vertically stacked line.

To put it another way, in the digital world perspective, the resolution is about 240x480 for a VHS video signal. (For a Black & White picture)

Color images use the same bandwidth, but a subcarrier to bring [chroma] information along to decorate that same line with color information, in televisions or displays that know how to use the extra information. So if you want to capture the color information along with the b&w [luma] information, you have to take more samples from the same line.

Televisions displayed color by firing signal at triple dots on each horizontal line, red-green-blue, in various arrangements - linear, circular cluster, etc.. but to be effective from a distance they had to appear as [one] colored dot. To a modern digital display that larger dot appears as [one] color.

But in order to capture all that information from a line of dots that may or may not be colored, the capture device must "sample" the line [three times] as much. So even though the horizontal dot resolution is 240 [luma], to capture in color you must sample at 240 x 3 = 720 [luma+chroma], for an overall capture resolution of around 720x480 to get the entire frame.

A "poorer" resolution signal, like with the limited bandwidth on a VHS tape, would be limited to less than 240 TVL (vertical tick lines crossing the horizontal axis), leading to something like 200 x 480 instead of 240 x 480 for a broadcast signal. So to capture a VHS signal, even accounting for color information (200 x 3 = 600) x 480, a 640 x 480 capture setting for a VHS signal digitizer is usually (more than) enough sample resolution to capture (all the signal that there is) coming from a VHS player.
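The arithmetic above (TVL times three samples per dot) can be wrapped in a tiny helper. This follows the document's own rule of thumb; a strict TVL definition would also involve the 4:3 aspect ratio, which is ignored here, and the function name is invented:

```python
# Convert a TVL figure into a minimum horizontal capture sample count,
# using the rule of thumb of three samples per "dot" to keep the chroma.
def min_capture_width(tvl: int, samples_per_dot: int = 3) -> int:
    return tvl * samples_per_dot

broadcast = min_capture_width(240)  # 720 -> capture at 720x480
vhs = min_capture_width(200)        # 600 -> 640x480 is more than enough
svhs = min_capture_width(400)       # 1200 -> 1200x480 for S-VHS sources
```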

S-VHS was a different video standard which some people could afford, if they had better tape quality, a better signal source, and a VCR that could record in S-VHS to an S-VHS tape.. not a common occurrence, though S-VHS VCRs eventually became the norm. And when forced to record broadcast as S-VHS on S-VHS tape, it did look slightly better. More often though, people didn't have S-VHS recordings even when they did buy S-VHS tape, because they never bothered to force the machine to record in S-VHS mode.

It is important to recognize though that S-VHS saved at a higher "TVL" or vertical line resolution

(remember: vertical line resolution is actually horizontal dot resolution; the number of vertically stacked horizontal lines was still 525, about 480 visible, set by the NTSC standard).

This meant the horizontal "capture" resolution should be increased beyond 720 to capture the extra horizontal detail. Advertised as "greater than" 400 "vertical lines" (TVL): 400 x 3 = 1200, or a new capture setting of 1200x480.

(but no Broadcast signal could reach 400 TVL lines of resolution.. only locally generated signals from computers, or certain LaserDisc players - maybe special Cable boxes, [later BluRay players] - could provide 400 TVL lines of resolution)

S-VHS-ET and Super Quasi Playback.. were [not] new video standards.

Basically they let you [declare] a tape as S-VHS capable even if it was really intended only for VHS recordings. In that way you could buy cheaper tapes, with the understanding that quality of the recording might vary with the brand and quality of the tape actually used. Though many recorders picked up the label as a feature, it wasn't used that often. Quasi Playback was a feature on later VHS (only) recorders that allowed playing back tapes recorded in S-VHS format on a plain old VHS recorder in a slightly "fuzzy" image mode.. simply to get backwards playability under marginal circumstances.

HQ was a somewhat less successful attempt to encourage enhanced noise reduction to improve picture quality, with no real change to the VHS and S-VHS signal format standards.

In the audio space, VHS started with a single mono audio track stored along the edge of one side of the tape, like a cassette recorder. Rarely, this was upgraded to a low quality stereo dual track that split the same mono track space into two tracks with half their normal resolution.. it was very uncommon. Later, (HiFi) ability introduced additional drum heads to lay down a special "deep" stereo track [underneath] the video tracks. This was popular for a few reasons, not the least of which was near CD quality audio. It also freed up the original mono track so it could be used for "dubbing" alternative audio, like another language, or for custom time codes, or for wiping the audio and replacing it with a new audio track without editing the video.

HiFi had a slight problem however in that "switching" noise due to the "switch" from one audio head on the video drum (as it swung around like a merry-go-round) to the other, could sometimes be heard as a low "buzz".. Dolby equalization was used to bandpass limit or minimize the problem.

For various reasons, sometimes people would choose to "change" the Audio source selected when playing a tape back, and this became the [audio monitor] switch on many VCRs. Norm generally played HiFi; Linear/Mono played the original "linear audio" track on the edge of the Tape; Mix would play both HiFi and Linear (and give a tunnel effect); and Left or Right would select one or the other stereo channel. For backwards compatibility a Linear mono track was normally always recorded at the same time a stereo track was being embedded "below" the video tracks in the center of the tape.

So while capturing video, the audio content may extend as low as 8 kHz or as high as 21 kHz, and the sound card used to capture alongside the video capture gear should sample at at least twice the highest frequency - 16 kHz up to 44 kHz - to make sure all of the sound's frequency range is captured. Even though less will often be more than sufficient.
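The sampling rule above is just the Nyquist criterion: sample at no less than twice the highest frequency you want to keep. A minimal sketch (names invented):

```python
# Nyquist: to capture audio content up to f_max without aliasing,
# the sample rate must be at least 2 * f_max.
def min_sample_rate(f_max_hz: float) -> float:
    return 2.0 * f_max_hz

linear_track = min_sample_rate(8_000)   # 16000.0 -> 16 kHz suffices
hifi_track = min_sample_rate(21_000)    # 42000.0 -> 44.1 kHz is safe
```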

Obviously for the Linear track, the slower the tape moves, the less bandwidth available for sound and thus its dynamic range will fall quite a bit.

Beginning with the HR-7000 series for JVC, and with different models in the SR and BR lines, JVC introduced various features to improve signal conditioning before recording and during playback. Some reduced noise and boosted signal to noise ratio, others stabilized the video signal to better conform with NTSC standards, and still others sought to improve the separation between luma and chroma information before recombining them to put them on tape, or when extracting them to send to optional S-Video jacks.

S-VHS is not S-Video

S-VHS was a format declaration regarding how a video signal was processed and stored on video tape.

S-Video was an electrical and connector standard regarding the separation of luma from chroma information.

Video signal normally has luma and chroma mixed together, which tends to blend or "smear" when transmitted over long distances or across poor quality cables. By keeping them separate as much as possible, this smearing effect does not happen and video quality remains higher.

Not all VHS players have S-video connectors, but it was relatively common in later years.

Selecting a VHS player, or VCR recorder, is made more difficult once all of the options in later equipment begin to be understood. It's easy to focus on the worst case scenario and become paralyzed with fear that an uninformed choice will be made.

But new gear is no longer being made.

The gear made last (and thus newest today), at the end of the production line, is not necessarily the most appropriate; it's been reported that cost saving measures on the final machines rendered them worse than slightly older machines. Again, a contradictory if not unhelpful conclusion.

Add to this, the brands and product lines from competing companies (Panasonic, JVC, and eventually Sony) varied considerably in quality and reliability.. and performed variably with tapes originally recorded on competitors' equipment.

It's generally presumed that if you have fewer than 100 tapes, a service bureau specializing in VHS to DVD or PC transfer is the lowest cost option.. but these businesses are disappearing and the quality of their service is also variable.. word of mouth being the best option to find a good one.. or trials with test tapes on a personal basis.

Learning all about the lines of one of the big three makers is time consuming but is probably worth the time. Recommendations for Panasonic and JVC are easiest to come by.. generally the ProLine for Panasonic or the Service (SR) line for JVC are good (if not expensive) places to start.. and then choosing only models that have a built in line TBC and some sort of video noise reduction system.

Tracking

Tracking is the feature of all VHS players that reads the bottom edge of a VHS tape and extracts a pulse indicating the relative speed of the tape past the playback heads. If it is too slow the machine will attempt to speed the tape motion up; too fast, it will attempt to slow it down. It is this real-time "feedback" loop which is the first line of stabilizing a playback picture. On top of that, the line TBC will attempt to regenerate the NTSC standard video signal and correct any errors it detects. Other features will attempt to improve color purity or reduce spurious noise in the signal. A tape can however stretch over time, distort, or lose its tracking information entirely, in which case "good" playback on one brand of machine may be unsuccessful, but may succeed on a different brand of machine.
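The servo feedback loop described here can be caricatured as a simple proportional controller. This is purely illustrative: the target rate, gain, and names are invented, and real VCR servos are far more involved:

```python
# Toy proportional feedback loop: nudge tape speed toward the speed
# implied by the control-track pulse rate (one pulse per frame, NTSC).
TARGET_PULSE_HZ = 29.97

def adjust_speed(speed: float, measured_pulse_hz: float, gain: float = 0.5) -> float:
    error = TARGET_PULSE_HZ - measured_pulse_hz   # positive -> tape too slow
    return speed + gain * error / TARGET_PULSE_HZ

speed = 1.0
faster = adjust_speed(speed, 25.0)  # pulses arriving too slowly -> speed up
slower = adjust_speed(speed, 33.0)  # pulses arriving too fast -> slow down
```

Each pass of the loop moves the speed a fraction of the way toward the target, which is the essence of the real-time stabilization the tracking circuit performs.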

For this reason transfer professionals tend to own or have access to multiple brands and models, and can try different machines to "try" to capture a clean transfer. This drives up cost and makes using a service bureau all the more attractive.

Acquiring a VCR is only the beginning

VCRs are all old; even the last Funai or Sanyo came off the line in 2016. The machines have rubber belts, rubber tires and rubber pinch rollers which are designed to last about 1000 to 2000 hours, or about 6 weeks of continuous use, before needing cleaning or replacement. Those parts also age just sitting on a shelf; ozone and other air impurities, or debris from tapes played back, can attack the rubber and accelerate part aging. For large quantities of tape, it's likely that if not as soon as it's purchased then soon thereafter it will need professional service from someone who knows how to disassemble and inspect the brand and model you acquire.. those people are retiring or have moved on from their last jobs.. they are becoming increasingly scarce and expensive, in dollars and time, to find. Almost before buying a VCR.. you need to figure out who and where it will be serviced. And when shipping a VCR cross country for repair, it can easily become damaged, stolen or lost.

But once you have a working VCR and it is producing a clean signal

There is still the issue of signal stability and Macrovision or copyright protection distortion.

Copyright protection was enforced by deliberately damaging part of the video signal. Capture devices detect this damage, and can judge even accidental signal damage as legitimate copyright protection enforcement.. and refuse to capture the signal.. or at the very least.. even if captured.. the damaged signal may appear distorted.. usually as flickering between high and low brightness.

The line TBC and noise reduction circuitry in a good VCR can clean up some problems, but Macrovision copy protection was enforced at the frame level.. which only the most expensive and rare VCRs could correct. So often an external (or "in line") full frame TBC is needed to correct frame level video signal errors.. and although not intended for the purpose.. this will also correct copy protection damage to the signal as well.

TBC - Time Base Correctors

TBCs come in several cost ranges and some brands are known to be better than others; universally however they tend to perform poorly if run for long periods or allowed to overheat. Generally the lowest cost ones suitable for real use start at about $400 but often cost much more, if they can be found.. they are also no longer being made.. like the VCR, the need for NTSC video time base correctors is going away. Over the air digital video signals no longer have to conform to the old NTSC standards, and although there are exceptions.. the old video signal standards no longer apply, so the equipment is no longer needed. Another reason a service bureau can be more attractive.

Video Processors

A video signal can lose luminance or chroma information, or it can become skewed or desaturated. An external video signal processor can be used to artificially restore the base or "floor" for some signal problems, or reduce the "ceiling" in some cases. This can often be done post capture as well, at greater CPU and time expense, but like photography, once detail is "blown out" it can never be recovered.. so sometimes having an external video proc to put "in line" can be beneficial.

Video Capture

Video capture device choices usually depend on budget, but also on available equipment, intended post processing (if any) and final destination. In the past people usually chose to capture to MPEG2 to save space, and because a great deal of effort went into the DVD standard people are familiar with the output quality.. it is a familiar "known". People worried however about "editability" and advanced editing "later". Simple commercial cuts and joins are relatively easy, and with experience can be accomplished without re-encoding and suffering a "generation loss" that would reduce the final output even if it is MPEG2. Greater compression to H.264 or DivX is becoming more familiar and "possible", but there are great long term tradeoffs to storing archival footage in such a compressed form.

On the other hand, precious footage kept at full capture resolution and left uncompressed (a wedding video for example), while larger on a data DVD or hard drive, is better kept stored that way, with compressed DVD "print" copies made for other people as needed.
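To put rough numbers on that space tradeoff, here is a back-of-envelope sketch. The resolution, sampling and bitrate figures below are illustrative assumptions (720x480 NTSC at 4:2:2, and a typical DVD-class MPEG2 bitrate), not the specs of any particular capture device:

```python
# Rough storage arithmetic: uncompressed NTSC capture vs MPEG2.
# All figures are illustrative approximations, not device specs.

def uncompressed_gb_per_hour(width=720, height=480, fps=29.97, bytes_per_pixel=2):
    # 4:2:2 sampling at 8 bits works out to ~2 bytes per pixel.
    bytes_per_frame = width * height * bytes_per_pixel
    bytes_per_hour = bytes_per_frame * fps * 3600
    return bytes_per_hour / 1e9  # decimal gigabytes

def mpeg2_gb_per_hour(mbit_per_sec=6.0):
    # DVD-quality MPEG2 video commonly sits around 4-8 Mbit/s.
    return mbit_per_sec * 1e6 / 8 * 3600 / 1e9

print(f"uncompressed: ~{uncompressed_gb_per_hour():.0f} GB/hour")
print(f"MPEG2 @ 6 Mbit/s: ~{mpeg2_gb_per_hour():.1f} GB/hour")
```

Roughly 75 GB per hour uncompressed versus under 3 GB per hour as MPEG2, which is why the "keep the master uncompressed, print compressed copies" approach only makes sense for footage you truly care about.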

A simple "raw" AVI capture at full resolution can generally be handled by any video capture card or USB capture dongle.. however the capture software used usually needs to be compatible with Windows and DirectX so that a range of PC equipment and operating system versions can be selected. Video capture devices tend to be sensitive to overheating and "dropped" frames. A "raw" AVI capture records video separate from the audio and combines them into the final file.. the capture software driving the capture hardware has to "compensate" for any dropped frames, and note them so that they can be accounted for.. usually it is best to minimize the possibility of dropped frames.. or the video and audio will drift out of sync.. compensation can be anything from [a] allow them to drift and let you fix it later, [b] chop up sections of audio to keep them in sync, or [c] duplicate frames in the file to make up for the difference.. all of which have consequences.
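The drift arithmetic is simple to sketch. Audio keeps running while the video track loses frames, so each dropped frame pushes the joined tracks apart by one frame duration. The function names below are mine, for illustration only, and the padding function is a toy model of compensation strategy [c]:

```python
# Why dropped frames cause A/V drift in a "raw" AVI capture, assuming
# video and audio are captured separately and joined afterwards.

def av_drift_ms(dropped_frames, fps=29.97):
    """Each dropped frame shortens the video track by one frame
    duration while the audio track keeps its full length."""
    return dropped_frames * (1000.0 / fps)

def pad_dropped_frames(frames, drop_indices):
    """Compensation strategy [c]: duplicate the previous frame in
    place of each drop so the video keeps its nominal length."""
    out = []
    for i, frame in enumerate(frames):
        out.append(frame)
        if i in drop_indices:
            out.append(frame)  # duplicate to fill the gap
    return out

print(f"30 drops -> ~{av_drift_ms(30):.0f} ms of drift")
```

Even 30 dropped frames, a tiny fraction of an hour-long capture, is already about a second of lip-sync error, which is why minimizing drops matters more than any compensation scheme.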

A more complex "MPEG2" capture will capture at full resolution inside the video capture hardware and then compress down into streamed video data bits intermingled with audio data bits. The combined data stream "locks" the video and audio together, so compensation is automatic and immediate. Since the hardware is dedicated to capture and compression, dropped frames do not lead to drift.. the streams are "locked".. a capture device may have glitches, but is never out of sync.
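The "locking" comes from timestamps: in an MPEG2 stream both the video and audio carry presentation timestamps (PTS) counted on a shared 90 kHz clock, so the player realigns them on playback regardless of capture glitches. A toy model (the function names are mine, not from any MPEG library; only the 90 kHz clock rate is from the MPEG2 standard):

```python
# Toy model of MPEG2 A/V "locking" via a shared 90 kHz PTS clock.
from fractions import Fraction

CLOCK_HZ = 90_000                 # MPEG2 PTS clock rate
NTSC_FPS = Fraction(30000, 1001)  # 29.97 fps as an exact ratio

def video_pts(frame_index):
    """PTS of the nth video frame, exact thanks to Fraction math."""
    return int(frame_index * CLOCK_HZ / NTSC_FPS)

def audio_pts(sample_index, sample_rate=48_000):
    """PTS of the nth audio sample on the same 90 kHz clock."""
    return int(Fraction(sample_index, sample_rate) * CLOCK_HZ)

# Frame 30 and audio sample 48048 both sit at t = 1.001 s, so they
# carry the identical PTS value and the player keeps them aligned.
print(video_pts(30), audio_pts(48_048))
```

Because both streams reference the same clock, a glitched or missing frame shows as a visual hiccup but never accumulates into drift, unlike the separately captured raw AVI case above.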

Mac's and DV - Digital Video

Apple Mac systems were and are known for video editing, but capture was never their long term goal. Some capture hardware is available, but the cost and selection are much more out of the customer's control. DV - digital video - was a popular camcorder and studio format (actually two formats with vast tradeoffs).. which left "video capture" entirely up to the video "filming" device.

That is, the camcorder had a "static" or fixed choice of video capture luma and chroma resolution, and one digital file storage format on video tape, which stored a digital "file" instead of an audio and video signal. DV "capture" is a misnomer; by the time the DV tape is written in the camera, it is a digital file.

Instead, DV "transfer" over firewire is merely the copying of a digital file between two computers. And "printing" or dubbing is merely the copying of a file from an NLE - non linear editor - back to the digital storage tape in the camera.

In this way DV tapes are merely "data tapes", and normally "video capture" never happens at a desk; it is done real-time in the field.

DV pass-thru can use the inputs and outputs on a camcorder to "pass-thru" a video signal through a camcorder in order to use its video signal capture circuitry to create a digital DV file and send that over firewire to another DV camcorder or NLE computer.

But it is very important to realize that the hard decisions, to compress or not to compress, and which hardware or software to use, are already made. The encoder was chosen by the DV standard, and re-encoding or "transcoding" in a PC or Mac NLE - non linear editor - software will represent a "generation loss".. DV is finalized "cooked" video; it can't be changed after the fact without losing data. But for simple cuts and joins, re-encoding or transcoding isn't needed. If the limited dynamic range of the scenes "filmed" is acceptable (as for news reporting) it is okay.. but not for things like "movies".

Consumer DV also used a compression choice that preserved less detail, to make equipment cheaper for people in the late 1990's when hardware was much slower.. more detail had to be sacrificed then than would need to be today.

DV broadcast (or one of its variants) used tapes that stored fewer minutes but also used a less compressed encoder to retain more image quality.. it was only available to users of "DV broadcast" equipment, and even today is not that great.. there are better choices now.
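The consumer/broadcast tradeoff is easy to quantify, assuming the commonly cited video bitrates: roughly 25 Mbit/s for consumer DV and 50 Mbit/s for the DVCPRO50 broadcast variant (video only; audio and subcode overhead excluded, so real files run somewhat larger):

```python
# Back-of-envelope comparison of DV flavor storage rates.
# Bitrates are the commonly cited video-only figures, an assumption;
# actual tape/file sizes include audio and subcode overhead.

def gb_per_hour(mbit_per_sec):
    return mbit_per_sec * 1e6 / 8 * 3600 / 1e9

for name, rate in [("consumer DV", 25), ("DVCPRO50", 50)]:
    print(f"{name} ({rate} Mbit/s): ~{gb_per_hour(rate):.2f} GB/hour of video")
```

Double the bitrate means half the minutes per tape, which is exactly the "fewer minutes, more quality" tradeoff described above.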

In fact people did commonly "dub" (duplicate by dubbing) VHS to DV in combo recorders for a time, later replaced by VHS to DVD recorders.. finally leading to off air recording on DVRs like TiVos and personal PVRs like Windows Media Center on the PC or EyeTV on the Mac, using dedicated MPEG2/4 capture tuner/encoders like the Silicon Dust HDHomeRun devices.

Linux

The Linux operating system evolved from little more than a boot loader in the early 90's to a full-fledged graphical desktop by the year 2000. Its users and developers sought to "dub" or duplicate much of the evolving functionality of the most popular operating system(s) of the day.. mostly Microsoft Windows.. by reverse engineering the software and hardware created for sale to users of those systems. Thus while inexpensive, and a great training ground for future programmers and for daily users with well defined needs.. it was mostly a "build it yourself" and "self supported" operating system. The makers of video graphics production and capture hardware were usually financially motivated, and protected their investments by patenting their software and hardware methods to prevent competition. Discovery through reverse engineering was the only way to construct a framework; occasionally companies going out of business would "gift" their technical knowledge, but this was rare.

In that light, Linux haltingly started and stopped many video graphics production and capture projects, the ultimate winner being the mostly kernel based driver interface called "Video for Linux", abbreviated v4l and later v4l2 (version two). Plug-in drivers could be created to "expose" features for capture devices, which software could then expect when searching for capture devices at startup. Several semi-commercial and freeware NLE systems emerged to support v4l2, but always lagged. At this point in history capture is pretty well supported for a small selection of video capture hardware, but since new video capture hardware is no longer being developed for a signal standard in decline.. the choices will probably remain the same or shrink.. they won't increase. Having many hardware choices is better, since some will have bugs that are only discovered later.
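The way v4l2 "exposes" hardware is as device nodes: each detected capture device appears as a character device like /dev/video0, which software enumerates at startup. A simplified illustration of that enumeration (real software would additionally query each node with V4L2 ioctls, or a library such as OpenCV, to check its capabilities):

```python
# Sketch of v4l2-style device node enumeration. The helper only
# lists names; querying capabilities is out of scope here.
import glob
import os
import re

def list_video_nodes(dev_dir="/dev"):
    """Return v4l2-style device node names found under dev_dir, in
    numeric order (so video10 sorts after video9, not after video1)."""
    candidates = glob.glob(os.path.join(dev_dir, "video*"))
    names = [os.path.basename(c) for c in candidates
             if re.fullmatch(r"video\d+", os.path.basename(c))]
    return sorted(names, key=lambda n: int(n[len("video"):]))

print(list_video_nodes())  # empty list on a machine with no capture hardware
```

Capture applications do essentially this scan when they start up, then open each node to ask what formats and inputs it supports.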

So while "possible", Linux is probably not the most robust, frustration free video capture platform, unless you were brought to Linux by the choice of the video capture hardware "first" and told Linux is the best operating system for that hardware. Choosing the operating system before the hardware is rarely the best approach for video capture.

What replaced DV tapes

Like VHS, the time for DV tape is also passing. Camcorders and "filming" equipment started slowly but have now transitioned to mostly "Progressive" recording of digital video, using a myriad of compressed or uncompressed encoding methods direct to data files.. first on small portable hard drives, then custom solid state memory cards, and finally plain old SSDs.

Since the "Progressive" method has much more in common with actual celluloid "film" than older "Interlaced" video signal camcorders.. studios and broadcasters eagerly adopted the transition and in some cases no longer even use actual film as the budget proposition inevitably trades places as to which is more expensive.

While JVC and Panasonic and many others adopted the MiniDV cassette, Sony invented the Hi8 format, and there were a few other formats like VHS-C, but the majority were DV centric, and like VHS that is what is mostly being transferred today. For the most part DV tapes have become scarce and hard to find except as New Old Stock (N.O.S.).

DVD to BluRay

PVRs and streaming have mostly reduced the market and demand for personal storage; cell phone video is generally much smaller and uploaded to the cloud for long term storage. Or personal videos are kept on local hard drives and synchronized with long term backups. The storage format changes but is generally unimportant to the end user, since transcoding on the fly in the background while displaying a file has become commonplace.

DVD as a long term format is being called into question, even using the M-Disc format; burning a DVD-R takes time, and the ease of scratching or otherwise damaging a disc calls their reliability into question. Also many DVD burner makers have left the market.. Samsung, Sony, even HLDT and LiteOn seem less likely to remain much longer. Blu Ray and Double Density Blu Ray as a movie storage format is strongly resisted, and Blu Ray PVRs are highly restricted in the US from burning copies, or must comply with Burn Once rules.. meaning there are few Blu Ray PVRs and they must compete with hard drive or streaming storage in the cloud.

And optical burners and players tend to have rubber belts which wear out over time, meaning optical drives, unless recently manufactured, will eventually become inoperable.. as demand goes down they are likely to increase in price.. so as a long term storage format, there are some serious questions to think about.