And what is to prevent abuse of the system, as has already been demonstrated with Tesla's Autopilot? You can't just hand-wave it away with "they should read the manual, obey the rules, not do that." The fact is, people are doing 'that' and endangering other people, and they're doing it precisely because the automation is on board. Yes, people do stupid things with non-automated cars, but that's not an excuse to hand-wave away a genuine, demonstrated safety issue that exists because of the automation.

If this auto-land system is an "emergency-use-only" system, does it declare an emergency to ATC when activated and squawk an emergency code on the transponder? Does it respond to an ATC instruction to change transponder code for special sequencing? Does it respond to an ATC instruction to use a particular runway? I don't mean ten or twenty years from now, when every manned aircraft has a mandated data link to ATC. I mean today. Or does ATC just unexpectedly have an aircraft entering the pattern that's totally unresponsive to control direction on their hands? Or are we hand-waving that away with an owner's-manual reference?

It's all well and good to do the engineering that can steer a car or land an airplane. Unfortunately, the real world is far more complex than that engineering task. There are still people out there, both using (and deliberately abusing) these systems and around the systems in use, vulnerable to their shortcomings and to that deliberate abuse. You can't dismiss the "human factor" with a blithe "read the manual" legal warning. That has already been demonstrated to be glaringly insufficient.