Abstract Submissions

You are cordially invited to submit an abstract for a proposed presentation, demonstration, or poster in one of the topic areas listed below. Submitted abstracts should enhance the discussion of the various initiatives and technology supporting the Safe and Secure Systems and Software Symposium (S5). Only unlimited-distribution, non-proprietary abstracts will be considered for acceptance in the topics listed below.

Please make sure you read and understand the rules before submission.

Critical Dates:

  • Abstract Submission: due by 02 Jun 2017
  • Abstract Notification: 16 Jun 2017
  • Final Presentation Submission: 24 Jul 2017
  • 2017 S5 Event: 01-03 Aug 2017

General Rules for Submitting Abstracts

1) Authors are asked to submit their abstracts electronically in Microsoft Word (version 2003 or later) or Adobe PDF (version 5 or later) format. Please limit your abstract to approximately 500-1000 words (i.e., roughly 1-2 pages, single-spaced). The abstract submission deadline is 02 Jun 2017.

Please include the following when submitting your abstract:

  • Title of the Technical Presentation/Demo
  • Desired Session/Topic Area
  • Presenter Name(s)
  • Title
  • Address
  • Telephone number(s)
  • Email address(es)

2) Abstracts must be submitted via email to the S5 Program Committee at S5Submissions@mys5.org. Email submissions should have a subject line of the form: 2017 S5 Abstract Submission - <lead author/presenter's last name>. Attachments should be in Microsoft Word (2003 or later) or Adobe Acrobat (i.e., PDF) format. A late submission will not allow time for proper review by the Program Planning Review Committee and may result in rejection of the abstract. Letters of acceptance or non-acceptance will be emailed to the main author of the submitted abstract by 16 Jun 2017. If notification has not been received by that date, authors should contact the Program Chair and/or Administrative POC listed on the Contact Us page of the website.

3) Authors making presentations/demonstrations on technical efforts performed under a government contract will be responsible for obtaining all necessary approvals and releases from the appropriate government agency for unlimited, public distribution of said material PRIOR to the Final Presentation deadline noted above. Submitted presentations must be appropriately marked with the proper public approval number or associated text as specified by the approving organization.

4) Abstracts must include a description of the content, its basis (e.g., a detailed case study, discussion of experience from practice, preliminary research data, completed research not presented previously, a review of literature, existing process and/or regulation, etc.), and the significance of the proposed presentation.

5) Depending on the number of abstracts received in a given area, the S5 Review Committee reserves the right to recommend that a selected abstract be converted from a presentation to a poster, or vice versa.

Session Topics are as follows:

Note: These Session Topics Areas have been approved for Public Release. Case Number 88ABW-2017-2218.

1. Assurance Arguments for Autonomous Systems

An assurance case can be defined as a structured argument, supported by evidence, intended to justify that a system is acceptably safe and secure. A defensible argument of acceptable risk is required as part of the regulatory process, with a certificate of assurance being granted only when the regulator is satisfied by the argument presented. Results from Test, Evaluation, Verification, and Validation (TEVV) do not by themselves determine operational risk, imply certification, or grant authority to operate. However, TEVV results provide the collected body of evidence that is presented to a certification board, and ultimately the milestone decision authority (MDA), to determine an acceptable level of safety, security, performance, and risk for that specific platform. The assumption is that no one method of verification and validation will be adequate for future autonomous systems. Therefore, not only must multiple new TEVV methods be employed to enable the fielding of autonomous systems, but a new research area must also be investigated: formally articulating and verifying that the assurance argument itself is valid. This topic focuses on the development of formal assurance cases for the purposes of analysis and reuse, providing a comprehensive argument that all requirements have been satisfied, including safety, security, and performance. Additionally, standard autonomy argument templates must be developed, enabling the reuse of explicit arguments of risk, performance, and safety, closely tied to autonomy requirements and TEVV practices that, if performed, provide an acceptable collection of evidence for an autonomous system.
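As a toy illustration of what a formal, machine-analyzable assurance case enables (the claim/evidence structure and all names below are hypothetical, not a prescribed notation), an argument represented as a claim tree can be traversed automatically to flag leaf claims that lack supporting evidence:

```python
from dataclasses import dataclass, field

@dataclass
class Claim:
    """A node in a toy assurance argument: a claim is supported either
    by sub-claims or by concrete evidence items (e.g., TEVV results)."""
    text: str
    subclaims: list["Claim"] = field(default_factory=list)
    evidence: list[str] = field(default_factory=list)

def unsupported(claim: Claim) -> list[str]:
    """Return the texts of leaf claims with no supporting evidence:
    the gaps a reviewer of the argument would flag."""
    if not claim.subclaims:
        return [] if claim.evidence else [claim.text]
    gaps = []
    for sub in claim.subclaims:
        gaps.extend(unsupported(sub))
    return gaps

# Hypothetical top-level argument for an autonomous air vehicle
top = Claim("Vehicle is acceptably safe", subclaims=[
    Claim("Collision avoidance meets requirement", evidence=["sim-campaign-42"]),
    Claim("Fail-safe recovery verified"),  # no evidence yet
])
print(unsupported(top))  # -> ['Fail-safe recovery verified']
```

Argument templates of this kind could be instantiated per platform and re-checked whenever the underlying TEVV evidence changes.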


2. Research into Formalization, Analysis, and Validation of Requirements, Architectures, and Models

This topic focuses on fundamental research into formalizing requirements and architectures, as well as the resulting models, for autonomous systems. This topic area involves the analysis of the correctness of system requirements that are, where possible, mathematically expressible, analyzable, and automatically traceable to different levels (or abstractions) of autonomous system design. This topic area includes the use and/or generation of new methods and techniques for the analysis of model-based design architectures. There should be a focus on new and innovative procedures that advance the state of the art in the analysis of complex and autonomous systems.
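As a minimal sketch of what "mathematically expressible and analyzable" requirements can look like (the properties and the trace below are invented for illustration, not drawn from any real system), simple temporal-logic patterns can be checked directly against finite execution traces:

```python
def always(pred, trace):
    """G p: the predicate holds at every step of the finite trace."""
    return all(pred(s) for s in trace)

def eventually_responds(req, resp, trace):
    """G (req -> F resp): every request state is eventually
    followed by a response state later in the trace."""
    return all(
        any(resp(t) for t in trace[i:])
        for i, s in enumerate(trace) if req(s)
    )

# Hypothetical trace of (mode, altitude) states for a toy autopilot
trace = [("cruise", 1000), ("avoid", 900), ("cruise", 1000)]

assert always(lambda s: s[1] > 500, trace)  # safety: altitude floor
assert eventually_responds(lambda s: s[0] == "avoid",
                           lambda s: s[0] == "cruise", trace)
```

Expressing requirements this way is what makes them automatically traceable: the same predicate can be checked against model-level simulations and later against implementation-level traces.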


3. Application of Formal Design and Analysis Techniques and Tools to a Challenge Problem for the Certification of Highly Complex or Autonomous Systems

This topic covers the application of formal method techniques as applied toward full system certification of an autonomous system. Though many formal analysis techniques have been around for years, it remains a challenge to provide techniques that drastically improve verification and validation early in the systems engineering design process. A focus of this topic area is to show the benefits of applying these techniques early in the design process by showcasing the results of applying these tools to a complex challenge problem. Additionally, this area focuses on the acceptance of formal methods tools for use in verification and validation and the integration of these tools into a complete systems engineering development process. This topic may also identify key barriers and gaps in current tools and methods with respect to current regulations such as, but not limited to, the DO-333 Formal Methods supplement to DO-178C and the DO-278A guidelines.


4. Tool Demonstration of Verification and Validation Approaches

Several research tools and associated analysis methods are being actively developed and evaluated to perform formal and informal system verification and validation. Presentations in this topic area will consist of a brief introduction to a particular problem domain and an associated tool for addressing the described challenge, followed by a live demonstration of the tool on the introduced problem using a sample system. Of particular interest is experience using the tool to detect non-trivial system defects, as well as shortcomings or desirable additional features discovered while using the tool. Tools may address verification challenges in requirements formalization or consistency checking, software algorithm verification, software code verification, control-theoretic techniques that include interactions with software, hybrid automaton verification approaches, and system test-case generation.


5. Autonomous System Regulations and Certification

To meet the challenges posed by operating in tomorrow's dynamic, complex, and/or contested environments, future autonomous systems will likely employ advanced technologies such as learning/adaptive and self-governing algorithms. To alleviate some of the demands on the operator, these systems must have an increased level of autonomy. However, as highlighted in the Air Force’s 2010 Technology Horizons, “It is possible to develop systems having high levels of autonomy, but it is the lack of V&V methods that prevents all but relatively low levels of autonomy from being certified for use.” The current regulatory procedures in the DoD and civilian world do not directly support new and innovative verification and validation (V&V) techniques. In many cases, certifiers have little to no experience with these advanced V&V techniques. This topic relates to the state of the art of regulations and certification processes for autonomous systems and how they will need to change to enable future advanced capabilities. Questions that could be discussed include, but are not limited to: What are the unique certification needs of autonomy? What are the gaps in the current certification and regulation processes in addressing autonomy? How do we change the certification and regulatory processes to accommodate the tools, techniques, and procedures discussed in the other topic areas?


6. Verification and Validation of Human-Machine Interfaces and Protocols

This topic focuses on the investigation of formal methods – including theorem proving, model checking, hybrid systems theory, domain specification languages, and human behavioral and cognitive models – in the design and verification of autonomous and human-automation systems. Of interest are the generation of test cases for autonomous or human-automation systems, the synthesis of “correct-by-construction” protocols or decision-making procedures based on human specifications or preferences, the verification of human-machine interface properties, and the use of formal frameworks for improving human-automation communication and joint decision-making.


7. Cumulative Evidence through RDT&E, DT, & OT  

Modeling and Simulation (M&S) and Test and Evaluation (T&E) at each Technology Readiness Level (TRL) and product milestone currently provide an invaluable resource, not only to verify and validate that a system satisfies user requirements, but also to aid in technology development and maturation. However, the development of effective methods to record, aggregate, and reuse T&E results remains an elusive and technically challenging problem. As an example, DoD Directive 3000.09 implies that autonomous weapons software, where possible, not be re-written, but incrementally developed and verified by sequential, progressive regression testing. It is paramount that products, methods, tools, and capabilities developed in verifiable requirements and design artifacts support the transition of autonomous systems to the DT and OT communities, to better define and, where reasonable, focus and increase the effectiveness of test and evaluation plans. Methods must be developed to record, aggregate, leverage, and reuse M&S and T&E results throughout the systems engineering lifecycle, from requirements, to model-based designs, to live, virtual, and constructive experimentation, to open range testing. This topic also endeavors to highlight research in the development of standardized data formats to encapsulate experimental results from early research and development, ultimately reducing the factor space in final operational tests. Additionally, statistics-based design of experiments methods currently lack the mathematical constructs capable of designing affordable test matrices for non-deterministic autonomous software. Software systems require a risk-mitigation methodology in the same spirit as Design of Experiments (DOE) but that does not rely entirely on statistical approaches. Finally, this topic ultimately focuses on the design and reuse of M&S and T&E data to support the reduction of T&E costs throughout a design lifecycle.
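One way to picture the standardized, reusable evidence records this topic calls for (the schema and every field name below are entirely hypothetical) is a simple tagged record that later test phases can query, so they can focus new test matrices on what earlier phases have not already covered:

```python
# Hypothetical schema for one reusable M&S/T&E evidence record
record = {
    "requirement_id": "SR-017",    # requirement the result traces to
    "phase": "DT",                 # RDT&E, DT, or OT
    "method": "model_checking",    # how the evidence was produced
    "result": "pass",
    "coverage": 0.92,              # fraction of the factor space exercised
}

def reusable_evidence(archive, requirement_id, min_coverage=0.0):
    """Return prior passing results for a requirement, so a later test
    phase can reuse them instead of repeating the earlier testing."""
    return [
        r for r in archive
        if r["requirement_id"] == requirement_id
        and r["result"] == "pass"
        and r["coverage"] >= min_coverage
    ]

archive = [record]
print(reusable_evidence(archive, "SR-017"))
```

The design question this topic raises is what such a shared schema must capture (configuration, environment, assumptions) for evidence produced in early R&D to remain valid at operational test.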


8. Run Time Assurance for Highly Complex or Autonomous Systems

For the most demanding adaptive and non-deterministic systems, we may need an even more dramatic shift. Currently, we attempt to prove systems correct by verifying every possible state PRIOR to fielding, through extensive and costly test and evaluation. However, for highly complex autonomous systems, an alternate method leveraging a run-time architecture must be developed that can provably constrain the system to a set of allowable, predictable, and recoverable behaviors, shifting the analysis/test burden to a simpler, more deterministic run-time assurance mechanism. This topic focuses on design implementations, analysis tools, and verification evidence that provide a structured argument justifying that a system is acceptably safe and secure, not only on the basis of offline tests, but also through reliance on real-time monitoring, prediction, and failsafe recovery. This topic covers technologies that leverage a run-time assurance approach to enable online monitoring, steering, and assuring of potentially unverifiable adaptive or autonomous systems.
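A minimal sketch of the switching logic such a run-time assurance architecture implies (the one-dimensional dynamics, safety envelope, and commands below are toy stand-ins, not a real design): the unverified advanced controller is used only when its command is predicted to keep the system inside a verified safe envelope; otherwise a simple, verified recovery command takes over.

```python
def rta_step(state, advanced_cmd, recovery_cmd, safe, predict):
    """One step of a toy run-time assurance switch: accept the advanced
    (possibly unverifiable) command only if the predicted next state
    stays inside the verified safe envelope; else fall back."""
    if safe(predict(state, advanced_cmd)):
        return advanced_cmd
    return recovery_cmd  # verified fallback behavior

# Hypothetical 1-D example: hold altitude within [100, 1000]
predict = lambda alt, climb: alt + climb
safe = lambda alt: 100 <= alt <= 1000

assert rta_step(950, 100, 0, safe=safe, predict=predict) == 0    # blocked
assert rta_step(500, 100, 0, safe=safe, predict=predict) == 100  # allowed
```

The verification burden then falls on `safe`, `predict`, and the recovery command, which are deliberately simple and deterministic, rather than on the advanced controller itself.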


9. Other Critical Technologies and/or Certification Issues in Safety and Security Applications

This topic will serve as a collection of presentations/demonstrations that do not fit neatly into any of the other topic areas but focus on certification-related issues in safety and security applications.


Abstract Selection

If your abstract is selected for presentation at the 2017 S5 event, you will receive an acceptance letter via email, along with additional instructions for preparing your presentation. You will be requested to submit final presentation materials either electronically or on CD. If submitting your presentation on CD, please mail it to: Sawdey Solution Services, Inc., ATTN: Ashley Kelly, 1430 Oak Court, Suite 304, Beavercreek, Ohio 45430. PowerPoint 2003/2007/2010/2013 are the preferred formats. However, presentations posted to the S5 website after the conference will be in Adobe PDF format. The final presentation must be received by 24 Jul 2017.

What is required if my abstract is selected?

  • A 50-100 word biographical sketch of the presenter
  • Presentation/Demo for display during the event
  • Each presentation must not exceed 30 minutes. Note: this timeframe may vary depending on the number of presentations received; you will be informed of your allotted time in your acceptance letter. Accepted authors must present/demo at the time scheduled by the conference committee.
  • Each presenter must register on the S5 Registration website.
  • Important Note: Use of personal laptops for presentations is strongly discouraged, both to save time and to avoid problems with switching multiple computers in and out. All presenters are strongly encouraged to submit their PowerPoint presentation in advance. Conference staff can then have your presentation pre-loaded on the conference computer and ready to go at your designated briefing time, saving critical time as each new presenter takes the floor. If you plan to present a demo and are accepted, please work with the Conference Coordinator, Ashley Kelly, to plan and work through the technical details of your demonstration in advance.