ONLINE RESUME

OFF THE MARKET


MATTHEW SCOTT

Sr. EDA Engineer

 (469) 531-9756
www.matthew-scott.com
 
SPECIALIZATION

Programming and support of world-class systems, as well as diverse projects such as factory automation, distributed programming, knowledge management and machine-learning systems.  Recently applied to design-optimization methods in Electronic Design Automation.

EDUCATION

  •  M.S.E.E., University of Arizona. 2004, Focus on A.I., Expert Systems and Knowledge Management
  •  B.S.C.S., Indiana University. 1994, Focus on A.I., Neural Nets, Parallel Computation. Honors curriculum.

KEY KNOWLEDGE

  • Perl/Tk, e.g., wrote a 12,000+ line distributed intelligent regression management system
  • Microsoft C, e.g., a 10,000+ line distributed communications scripting system for 2,500 users
  • Parallel C++, e.g., a parallel genetic algorithm for load distribution and balancing
  • Scheme, e.g., a genetic algorithm for evolving intelligent Turing machines (GA)
  • Lisp, e.g., a Dempster-Shafer Theory of Evidence system and a production planning system (lsp)
  • Cadence SKILL, e.g., a pseudo production system for design migration, plus various utilities
  • UNIX csh/sed/awk, e.g., numerous scripts to automate simple tasks
  • Java 2+, e.g., a Java-based IC layout system with OO devices, nets and routing (JLE)
  • CORBA, e.g., a distributed design environment (paper)
  • COBOL, e.g., a distributed remote data terminal communications control system for GE (ALMS)
  • Informix-SQL, e.g., various ‘TPR’s to interact with a massive Naval
  • Pascal, e.g., a distributed airline scheduling system
  • BASIC, e.g., a Naval GPS navigation, planning and prediction system
  • Prolog: some use in tandem with UCPOP (a partial-order planner)
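
A genetic algorithm for load distribution and balancing, like the Parallel C++ project listed above, can be sketched serially. The following is a minimal, illustrative Python version (the original was Parallel C++; all names, parameters and rates here are assumptions, not the original design):

```python
import random

def ga_balance(tasks, n_machines, pop_size=40, generations=200, seed=1):
    """Assign weighted tasks to machines, minimizing the maximum
    load (makespan). Chromosome = one machine index per task."""
    rng = random.Random(seed)

    def makespan(assign):
        loads = [0.0] * n_machines
        for task, m in zip(tasks, assign):
            loads[m] += task
        return max(loads)

    # random initial population
    pop = [[rng.randrange(n_machines) for _ in tasks] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=makespan)              # lower makespan = fitter
        survivors = pop[: pop_size // 2]    # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, len(tasks))
            child = a[:cut] + b[cut:]       # one-point crossover
            if rng.random() < 0.2:          # mutation: move one task
                child[rng.randrange(len(tasks))] = rng.randrange(n_machines)
            children.append(child)
        pop = survivors + children
    best = min(pop, key=makespan)
    return best, makespan(best)

tasks = [7, 3, 5, 2, 8, 4, 6, 1, 9, 2]
assignment, span = ga_balance(tasks, n_machines=3)
```

In a parallel version, each processor would typically evolve its own sub-population and periodically migrate its best individuals to neighbors (the "island model"); the serial loop above is just the per-island core.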

EMPLOYMENT HISTORY

Intel Corp, Sr. Design Automation Eng.  (2010-20XX)

  • Lead Design Automation Engineer, Analog and Mixed-Signal Simulation.
  • Based on my first assigned project ("Audit Intel capabilities vs. others"), set the course for advanced simulation methodologies, modeling and collaborative design at 14nm. Have made dozens of presentations and proposals on best practices, advanced capabilities and industry best-in-class tools ... and in particular on strategies for succeeding in the SoC world by leveraging Intel's massive resources.
  • Lead DA engineer and principal for one of the 6 groups developing a revolutionary capability that all others have failed at (proposed the project, based on configuration of world-class tools, theoretical analysis and massive IP resources, and carried it through multiple levels of management). Management deems this project one that will 'change the way design is done at Intel world-wide' and implement a 'culture change' that will enhance productivity by over 50%.
  • Typically work 12-16 hours per day, weekends and holidays ... loving every second of it!
  • Programmed numerous circuit, environment and tool utilities in SKILL.

Texas Instruments Dallas: Sr. Design Automation Eng. (1994-2009)

  • Programmed in SKILL an Electromigration Calculator utility for Bosch GmbH.
  • Programmed in SKILL a Front-End for a Thermal Analysis Calculator: extracts layout data and creates inputs for the Therm tool.
  • Programmed in SKILL a DB migration utility (SKILL GUI + DB analysis): checks DB integrity and facilitates migration ordering.
  • Programmed in SKILL a Schematic Area Estimation & Placement Utility: enables early cost estimates and floor-planning in-situ.
  • Programming and Implementation of Signal Integrity and Parasitic Analysis tools and methods:  Ensuring the accuracy and robustness of parasitic extraction and back-annotation.  Defining the S.I. flows/tools.
  • Programming and ownership of Process Design Kits (Integrated Design Environments for Circuit Design)
  • Programming (12000+ line Perl/Tk) of Regression Management Studio
  • Managed software development life cycle for all above: Study, Plan, Design, World-Wide Test, Implement and Support.
  • Programming in Perl/CGI, Java/CORBA of a design flow automation system.
  • Lead: Simulation Efficiency (500+ customers):  World-Wide champion of advanced simulation methods, simulation data management, and integration of analysis tools.
  • Work with systems and network administration on tool integration, compute, network operating-systems, auxiliary tools and design environment architecture requirements. 
  • Served as Project Lead and  mentor to a team of senior engineers in development of a front to back Design Kit.
  • Developed comprehensive strategic plan targeting design speedup, streamlining opportunities, and growth paths. 
  • Developed forecasting and analysis system for design projects needs on process, tools and EDA resources.
  • Coordinated and directed contractors, schedules, resources, contacts, deliverables.  

Sr. Systems Engineer, Computer Data Systems Inc., Rt. 1 Box 620, Crane, IN 47522.  1990-1994

  • Lead/Principal programmer (Microsoft C) of a distributed communications system with a scripting language and internal protocol drivers for asynchronous RS232, V2 and TCP/IP, used by 2,500+ people on a Hughes System 2000 Broadband WAN.
  • Systems analysis of a distributed database: graph-theoretic analysis and optimization of database protocols, leading to a complete overhaul.
  • COBOL programming in a SQL environment with TCL.
  • Secret Clearance

Systems Engineer, International Computer Services, (ICS), Bloomington, IN.   1988- 1990

  • Developed a COBOL distributed-computing network protocol conversion system on MicroVAXes to support the wireless links by which field-operated Mobile Data Terminals communicate with a centralized DB system.
  • Co-developed and implemented a COBOL-based plant-wide assembly-line unit tracking system for the GE-Bloomington refrigerator factory, using laser scanners, motion detectors, etc.  The system dramatically improved the efficiency of materials delivery and unit tracking, enabling JIT material delivery as well as supply-chain production planning.
  • Secret Clearance

First Class Operations Specialist, U.S.N. and USNR, Indianapolis.  1983- 1995 (12 Years)

  • Systems administration, programming and operation of integrated tactical and logistic systems.
  • Enhanced the ship's relative-motion navigation system with Closest-Point-Of-Approach prediction and contingency routing scenario development (BASIC and some trigonometry).
  • Developed simulated combat-theatre game scenarios.  Received a commendation for this work.
  • Recommended for Chief Petty Officer ... missed due to downsizing after the Cold War.
  • Secret clearance, D.I.S.
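
The Closest-Point-Of-Approach prediction mentioned above reduces to a small relative-motion calculation: the squared distance to a contact moving at constant relative velocity is quadratic in time, so the minimum has a closed form. A minimal sketch in Python (the original was BASIC; function and variable names here are illustrative):

```python
import math

def cpa(rel_pos, rel_vel):
    """Closest Point of Approach for a contact at constant relative
    velocity.  rel_pos is (x, y) in nautical miles; rel_vel is (vx, vy)
    in knots.  Returns (hours until CPA, CPA distance in nm)."""
    rx, ry = rel_pos
    vx, vy = rel_vel
    v2 = vx * vx + vy * vy
    if v2 == 0.0:                 # no relative motion: distance is constant
        return 0.0, math.hypot(rx, ry)
    # minimize |r + v*t|^2 by setting its time derivative to zero
    t = -(rx * vx + ry * vy) / v2
    t = max(t, 0.0)               # CPA already passed -> now is the closest
    return t, math.hypot(rx + vx * t, ry + vy * t)

# Contact 10 nm due east, closing westward at 5 kt with 1 kt northward set
t_cpa, d_cpa = cpa((10.0, 0.0), (-5.0, 1.0))
```

On a real bridge the same numbers come from own-ship and contact course/speed vectors (relative velocity = contact velocity minus own velocity); the prediction then feeds contingency routing decisions.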

SELECT (INTERESTING) ACADEMIC COURSEWORK

Lost River High School, Junior-Senior, 1981-1983

  • Taught Commodore 64 Basic programming to students (self-taught, wrote full ‘Space Invaders’)
  • Captain of Football Team.  Lost only in State Championships.

  MSEE Thesis: PDK QC, SI

  • VLSI Systems Design (C620)
  • Artificial Intelligence I & II (Indiana)
  • VLSI & Robotics Design (C622)
  • Neural Computation (Q550)
  • Parallel Computation (C690)
  • VLSI Verification (C690c, Indiana)
  • Computer Networks (ECE578)
  • Computer Architectures
  • Genetic Algorithms (C690b)
  • Digital Design I & II (ECE470)
  • Machine Learning (C665, Indiana)
  • Computer-Aided Logic Design (ECE574a)
  • AI – Expert Systems (ECE579)
  • Telecommunications Networks (ECE678)
  • Physical Design Automation VLSI (ECE672, UofA)

PUBLICATIONS AND WHITE PAPERS OF INTEREST

PERSONAL INTERESTS

  • Travel:  Have visited over 20 countries, engaging in humanitarian/good-will missions in most
  • Soccer:  Have been a striker for numerous league teams.  Played in 10+ countries.
  • Chess:   Love it as much for the mental challenge as for the history and AI computational intrigue
  • Quantum consciousness, NNs and hard AI:  Follower of the Kurzweil Singularity


COVER LETTER ONLINE


 
 
Greetings!  I hope you are looking for a PDK, PVM and EDA specialist with a proven record of excellence!
 
    My career started as a PDK Development and Support Specialist.  As EDA Project Lead for Burr-Brown Corp, I developed the PDK infrastructure, and the EDA environment to support it, essentially from scratch.  Besides structuring the libraries, project setups and tool installations to enable streamlined design, I maintained multiple organization and planning instruments to ensure optimal PDK/tool/project development synergy between team members.  To architect streamlined design flows, I initiated the strategy of Pcell development with Virtuoso-XL and IC-Craftsman (VCAR) for auto-layout, and integrated everything into the Cadence environment with SKILL.  This included developing or modifying the symbols and netlisting functions to map to devices and SPICE models in various foundry technologies.  The Pcells/symbols and sub-circuits also needed to be structured for LVS netlisting and post-parasitic re-simulation.

   Training and presentations are my forte.  For example, I have authored hundreds of solutions and tutorials – spanning from advanced simulation strategies with Spectre corners, Monte-Carlo and parametric sweeps to meet spec compliance and yield, to a full-fledged half-day RCX training course presented in both the USA and Germany.  I developed Burr-Brown's first Designer's Guide to a complete process, covering usage of all tools, libraries and design flows.

    Definition, planning, development and validation of Physical Verification run-sets has been my responsibility for 15 years.  I began writing Physical Verification rule decks in 1994, including multiple decks on various processes and PDKs for Dracula, Diva, Assura and a bit of Mentor Calibre.  I led the implementation of parasitic extraction and validation using TMA Raphael and QuickCap.  To facilitate QA of the LVS and parasitic extractions (and silicon-vs-simulation fit), I wrote a 12,000-line Perl/Tk regression management system ... on my own time.  For this work, I was promoted to MGTS (Member, Group Technical Staff).

    The hard science, down to the algorithmic theory, is also a passion.  I have read well over 300 papers on parasitic extraction methods and theory in the process of completing my EE Master's thesis at the UofA.  My addiction to self-improvement also drove me to complete 56 hours of advanced graduate work, with a 3.95 GPA in Computer Science graduate work (and a chapter in an IEEE book on robotics) and a 3.44 GPA in Electrical Engineering Master's work.  For fun, I have started writing my own Java-based layout environment, based on theory and practice from an advanced class on Physical Design Automation.  Moving to Germany in 2004 finally put a cold-turkey damper on my academic habit.

    Leadership, teamwork skills, a positive attitude and consistent performance beyond the highest expectations are reflected from my earliest years (Life Boy Scout; captain of a football team undefeated until the State Championship playoffs), through my 12 years with the US Navy/Naval Reserve and my recommendation for Chief Petty Officer.  My reviews also reflect my interest in humanitarian/goodwill efforts, where I spent nearly all my off-duty time while overseas.

   Recently I have enjoyed writing SKILL for an Electromigration Calculator, a Thermal Analysis Front-End, and a CDB to OpenAccess library migration utility.  These projects required world-wide coordination of team members, project plans, presentations, demonstrations ... and not a few late nights!

    You can find a recent resume here:  http://matthew-scott.com/.  There I also share all of my performance reviews, recommendations, and a few projects.  In recent years, I have been promoted several times while with Burr-Brown and Texas Instruments: CAD I -> CAD III -> EDA Project Lead -> MGTS.  You can see from my EDA/CAD training curricula that I have acquired extensive knowledge in just about every field of CAD.  I am, in essence, an EDA nut-case.

   I hope you have found these details useful in forming a character and skills-fit reference, rather than reading them as self-congratulatory.  Having my own web page and posting all this information is not my idea of fun or a productive use of time - but it is necessary in these times.

Thanks!

Matthew Scott


 


RECOMMENDATIONS


http://matthew-scott.com/Recommendations

------------------------------------------------

 “Matthew did an excellent job of creating a GUI (using SKILL code in Cadence environment) for a "Thermal Calculator" application. He owned the entire task, and very rapidly fine-tuned as per user feedback. I was very impressed by his knowledge of SKILL and his initiative to carry the task to completion.” February 18, 2009

Vinod Gupta, Distinguished Member Technical Staff, Texas Instruments
managed Matthew indirectly at Texas Instruments

------------------------------------------------

“Matt and I worked together on the CDB to OA Cadence database migration project. Matt is very creative in his work and his graphical user interface for this migration project was very well received across TI. He's an excellent team player and his feedback helped improve the project's efficiency. Matt is an expert in Cadence SKILL programming and I would strongly recommend him.” February 4, 2009

Padman Sooryamoorthy, Staff Application Engineer, Cadence Design Systems
worked with Matthew at Texas Instruments

------------------------------------------------

“I worked with Matthew in the EDA group at Texas Instruments. I could always count on Matthew to help me with testing VAVO [electromigration tools] before an internal software release to make sure we were not missing any known previous issues. One of Matthew's main projects while in the group was to develop a flow and code up the GUI for migrating projects from IC51 to IC61. He handled the critiques well and was able to integrate feature requests and direction changes well on a short schedule. I enjoyed working with Matthew. His experience made him a good candidate to bounce ideas off of when developing and improving flows. He was always willing to take a few minutes out of his schedule to help coworkers with questions they had.” July 30, 2009

Justin Gedge, EDA / CAD Engineer, Texas Instruments
worked directly with Matthew at Texas Instruments

------------------------------------------------

“Matt and I were members of the EDA team at Texas Instruments Freising. Working with him was a great experience, since he has an outstanding IT and EDA knowledge. He has the ability to apply it in time even in case of very complex project requirements. This is why he was a sought-after development partner within TI Freising´s analog/mixed-signal development groups, like MSP430, Automotive, DCDC and High Frequency Circuits.” July 28, 2009

Achim Bauer, EDA support engineer, TI
worked directly with Matthew at Texas Instruments Germany

------------------------------------------------

 “Matt is an extremely intelligent and capable person- he enthusiastically takes on new and complex assignments, will latch onto a project and attack it full-on, and works hard to see it through to completion. He is continually updating his professional skills and knowledge, isn't afraid to put in long hours when needed, gets along with everyone, and would be a huge asset to any company. I highly recommend him as an employee.” May 11, 2009

Peter Belleau, Design Engineer, Burr-Brown Inc. worked with Matthew at Burr-Brown

------------------------------------------------

“I worked with Matt for 2.5 years in the same EDA group of Burr-Brown Corporation (Texas Instruments). Matt played a very important role in the EDA group. He was multi-tasking: developing PDKs, trouble-shooting technical problems, and providing training to designers. Matt is a very talented, hard-working and skilled EDA engineer. Also, he is a great team player. No doubt he can play a very important role in the success of any organization.” February 12, 2009

RAJESH RAJPUT, Senior PDK Developer, Texas Instruments
worked directly with Matthew at Burr-Brown

------------------------------------------------

I first met Matt when I started at TI in the position of EDA manager and he was assigned to report to me.   Matt is one of the most creative and diligent EDA engineers I have ever worked with. He works long hours, documents his progress in uncommon detail and methodically approaches the solutions he is assigned to deliver. While he reported to me, he discovered the need for a comprehensive system that I chose to think of as a characterization system, although it did much more. After receiving input from all the design managers, he developed the system in a very short period, and those managers were very happy with his project.

I would highly recommend Matt for a position requiring creative use of CAD tools, especially in the Cadence environment. He works very well under pressure and is dedicated to further advancing users' productivity in integrated circuit design with vendor tools.

David Vaughn, Ph.D. (Prior EDA Manager, Texas Instruments Tucson)

------------------------------------------------

    “Matt [was] the cornerstone of the EDA organization at Burr-Brown as he took on another very large workload during the past year. Among the most important accomplishments was Matt’s role in recruiting and mentoring additional EDA support personnel.  Momchil Milev and Ron Dionne have already made a significant impact on the EDA environment at Burr-Brown.  He also helped in recruiting the new EDA Director and another experienced EDA Engineer (Rajesh Rajput).  Matt was instrumental in developing an overall EDA methodology that will take Burr-Brown into the next millennium including Pcells, Diva, IC Craftsman, Silicon Ensemble, and Analog Artist and he will provide the essential support as we fully implement this more modern methodology over the next year. A major task that Matt took on was to coordinate the DSM Technologies contract to train our Layout personnel on IC Craftsman.  Much more training will be needed, but this is a huge step towards our goal. Matt has an outstanding overall understanding of what we are trying to accomplish with a more modern EDA methodology and his inputs have been key in developing this plan. He is recognized as a leading expert within the design community on a wide range of tools and this knowledge has helped us keep up with an ever increasing product introduction objectives for all divisions.  Matt gets along well with co-workers and vendors and is always willing to do whatever it takes to get the job done.  He continues to work late into the night to complete tasks such as LVS that are needed to keep our many products moving forward.  About the only real problem is that there is not enough of Matt to go around and some design engineers are frustrated that matt cannot solve all of their issues immediately.   He was also instrumental in working with me to establish a weekly EDA/Layout communications meeting that has greatly enhanced the communications between all 3 layout organizations and the EDA group.” 

 

Paul Prazak, V.P. High Speed Products Division, Burr-Brown Corp.

 


PERFORMANCE REVIEWS


http://www.matthew-scott.com/data/Perf_Revs.htm

 

1992 Performance Review and Recommendation for Chief Petty Officer, USNR, Indianapolis, IN

OS1 (MATTHEW) SCOTT’S SUPERB PERFORMANCE IN PLANNING AND TRAINING FOR THE EXECUTION OF CONVOY EXERCISE RAINBOW GULF 92-2 DIRECTLY CONTRIBUTED TO ITS OUTSTANDING SUCCESS.  AN INTELLIGENT, ACCURATE AND FAST WORKING OPERATIONS SPECIALIST … 4.0/4.0

OS1 SCOTT IS UNRESERVEDLY THE MOST IMPECCABLE ENLISTED WATCHSTANDER IN THIS EXERCISE.  HE HAS MY CONFIDENCE IN BEING ON THE BRIDGE DURING THE MOST CRUCIAL PHASES OF CONVOY SHIP-HANDLING.  HE IS THEREFORE MOST STRONGLY RECOMMENDED FOR PROMOTION TO CHIEF PETTY OFFICER.

 

- Captain Fitzwilliam, P. K.

1994 Performance Review, Cathy Poole, MGR, CDSI Crane Indiana

Matt has proven himself to be an excellent programmer. He has been able to take responsibility for some existing projects without any real loss of continuity. This is quite an accomplishment when the complexity of these projects is considered. The quality and quantity of Matt’s work exceeds any Company or client expectations. The client respects Matt’s abilities and talents.  He has always done whatever it takes to ensure client satisfaction. Matt has always willingly accepted all assignments. He is able to adjust to changes in priority and can work on several assignments at once. Matt maintains a high professional standard of behavior and performance. Matt is a true asset to the Crane CDSI organization.  His dependability and versatility add to his worth. Thanks for a job well done Matt, and keep up the good work.  Your efforts are recognized and appreciated. –  Cathy Poole, Site Manager, CDSI.

PROMOTION: SYSTEM ENGINEER -> SR. SYSTEMS ENGINEER

 

1995 Performance Review, Dewitt Ong, Ph.D., EDA Manager, Tucson

Matt has brought order and discipline in the area of design verification tools. Matt is very customer oriented.  When a flow is urgently needed, he works long hours to deliver the product in a timely manner. He also spends considerable time helping the design engineers use the design verification tools. Matt works well with the layout group … Matt also works well with design engineers, accommodating their requests in a manner that is satisfactory to both parties.

OVERALL RATING: SIGNIFICANTLY ABOVE (highest)  - Dewitt Ong, Ph.D., Manager EDA

 

1996 Performance Review, Richard Clark, CAE Manager, Tucson  

Mr. Scott has achieved a very high level of productivity through this last year.  Not a single complete LVS run (final) is performed without Mr. Scott’s input to ensure they are as error-free as possible. His work on C23 with regard to Pcell generation and SKILL code will push CAE + CAD to performance levels not seen before. Mr. Scott has completed several graduate-level courses that directly apply to his position and is progressing towards an MSEE degree.  He attended a Cadence SKILL™ programming course several months ago and is now using that language to customize the menu environment of Cadence for CAD, to ease LVS runs, and to ensure that the correct rule sets are always used.  Matthew has a paper being published in an IEEE book on robotics (Stiquito, a Platform for Artificial Intelligence).  Mr. Scott has to juggle many different development goals + documentation with his role as ongoing support for design and CAD tools.  The latter is always the priority, but Matt has put in an incredible number of hours on weekends and evenings.  Because of this, everything manages to get done.  Matthew maintains a list of problems he finds well in advance of the Design and CAD people, and strives to correct them before they become actual ‘glitches’ [i.e. preventive rather than reactive firefighting].

OVERALL RATING: SIGNIFICANTLY ABOVE (highest)  - Richard Clark, CAE/EDA Manager.

1997 H1, Performance Review, Richard Clark, CAE Manager,

Matthew has dedicated an incredible number of hours to ensure that many previously neglected or ignored parts of design, CAE and CAD have documented procedures that are easily understood and followed.  From the outline of the projects that Matthew updates weekly, it is obvious that he is involved in all facets of support for CAE and CAD.  He has a much better understanding of the interface between layout and design than they do.  Matthew’s course work towards his MS EE directly applies to his work in CAE.  He stays abreast of current developments in the Design/CAD tool arena, and is active in searching for better tools for BB.

Matthew is exceptional at planning and prioritizing his work, and anticipating problems.

We depend on his judgment on many critical problem areas.  Matthew is results oriented, and is always looking for a better solution.

OVERALL RATING: SIGNIFICANTLY ABOVE.  – Richard Clark, CAE/CAD Manager

 

1997 H2, Performance Review, Paul Prazak, High Speed Products V.P.

(Matthew) Brought up Pcells for TSMC0.6u.  Implemented DIVA on TSMC0.6um and P43x (Bipolar).

Created Dracula DRC/LVS/LPE/PRE run sets for several processes. (Matthew has) been a key resource in EDA recruiting and interviewing.  Heavily responsible for the successful recruiting of Momchil Milev … Matt is developing ever better skills as an EDA engineer. Matt puts in extra effort to accomplish objectives.  During this year Matt’s ability to complete projects to the 100% level has been significantly hampered by the enormity of the task.  However, Matt still provided excellent support for solving short-term issues that arose.  Forthcoming EDA staff are expected to boost Matt’s output significantly since they will share in the day-to-day short-term problem solving issues.

Matt takes direction well.  Matt relates well to all personnel in general.  Because of his positive attitude, Matt is readily drawn upon (by designers) to solve EDA issues.

OVERALL RATING: FREQUENTLY ABOVE – Paul Prazak, V.P. High Speed Products Division.

 

1998 Performance Review, Paul Prazak, V.P. High-Speed Products, Tucson

Matt continues to be the cornerstone of the EDA organization at Burr-Brown as he took on another very large workload during the past year.

Among the most important accomplishments was Matt’s role in recruiting and mentoring additional EDA support personnel.  Momchil Milev and Ron Dionne have already made a significant impact on the EDA environment at Burr-Brown.  He also helped in recruiting the new EDA Director and another experienced EDA Engineer (Rajesh Rajput).

Matt was instrumental in developing an overall EDA methodology that will take Burr-Brown into the next millennium including Pcells, Diva, IC Craftsman, Silicon Ensemble, and Analog Artist and he will provide the essential support as we fully implement this more modern methodology over the next year. A major task that Matt took on was to coordinate the DSM Technologies contract to train our Layout personnel on IC Craftsman.  Much more training will be needed, but this is a huge step towards our goal. Matt has an outstanding overall understanding of what we are trying to accomplish with a more modern EDA methodology and his inputs have been key in developing this plan. He is recognized as a leading expert within the design community on a wide range of tools and this knowledge has helped us keep up with an ever increasing product introduction objectives for all divisions.  Matt gets along well with co-workers and vendors and is always willing to do whatever it takes to get the job done.  He continues to work late into the night to complete tasks such as LVS that are needed to keep our many products moving forward.  About the only real problem is that there is not enough of Matt to go around and some design engineers are frustrated that matt cannot solve all of their issues immediately.   He was also instrumental in working with me to establish a weekly EDA/Layout communications meeting that has greatly enhanced the communications between all 3 layout organizations and the EDA group.

OVERALL RATING: FREQUENTLY ABOVE - Paul Prazak, V.P. High Speed Products Division

PROMOTION: EDA PROJECT LEAD

 

2000 Performance Review, Director EDA, Tucson 

Matt has a talent to analyze very broad and complicated issues and to break them down to the detail level.

He has deep knowledge of all Cadence tools, process flows, and physical layout construction.

Matt has developed templates for developing and testing LVS.  Matt also has a very broad knowledge of many other fields.

Matt works well with the other EDA staff.  He also works with CAE to solve global issues.

Matt has worked very hard towards the new goals.  As a Project Lead, Matt needs to continue to develop his rapport with the engineering community.

OVERALL RATING: FREQUENTLY ABOVE - EDA Director

 

Director EDA, Tucson

Matthew Scott is the EDA Project Lead and a Senior EDA engineer at TI-Tucson.  The EDA profession demands a multi-disciplinary combination of skills in EE, CS, business and diplomacy … skills which Matt honed in his previous positions as Asst. Project Coordinator (’87-’88) at GE developing a distributed/wireless communications system, as Project Lead at CDSI (’90-’94) developing another distributed communications system, and as a Combat Information Center Supervisor in the U.S. Navy / USNR (’83-’95).  Collaterally, EDA demands continuous education to keep ahead of the technology.  As such, Matt has compiled a curriculum vitae listing more graduate-level EE-CS work than is required of a Ph.D., including many professional courses and conferences.

                In the review below, please evaluate the contributions keeping in mind the ‘wild-west’ conditions of EDA in Tucson at Matt’s hire date in ’94: at the time our environment consisted of specialized and diffuse symbol sets, scattered model sets, and a lack of defined device libraries, with no centralized EDA documentation system and the resulting impossibility of performing layout-vs-schematic equivalence checks, not to mention the barriers to design re-use.  Matt was, for most of the years ’94-’98, the sole EDA hand, and as such had to simultaneously accommodate designers’ creativity while trying to develop and gradually standardize the multiple flows of some 13+ processes, all while supporting the existing proliferation of creativity. The following bullets summarize:

-          Created EDA standard directory system for process tools, cell, symbol, model and verification library standardization ‘94

-          Created management system of Process Design Kits, including library control lists, development schedules, tracking sheets. ‘95

-          Pioneered EDA Intranet, Knowledge Management initiatives.  Intranet is now critical to communications! ’95-‘01

-          Developed documentation systems to present organized PDK’s to design community

-          Motivated the Design Archiving system, based on Perl/CGI/VB; enabled design rework several times. ‘97

-          Compiled standard symbol sets from best cases across numerous sources, for both Workview and Composer. ‘95

-          Coordinated with modeling group to create standard models directories and simulation interface configurations. ‘95

-          Coordinated with layout and the design council to compose and validate standard device sets for all processes (P43X, TSMCx etc.) ‘95

-          Compiled the 220 page P43X Designers’ Guide, which was heralded as the first-ever designers’ guide at Burr-Brown. ‘95

-          Developed Dracula LVS Design Verification for P43X, P44X, P45X, CBICU2, TSMC0.6 etc. 1000% speedup over old methods.

-          Created Digital Cell library dynamic Intranet data sheet creation system with Perl/CGI. Created Auto-Flow Generators in Perl. ‘99

-          Development of IC-Craftsman analog auto place-and-route.  Layout reports reduction of 3 month layout to 3 weeks. ‘98

-          Evaluation and implementation of Analog-Artist tools, models configuration, demo to engineers.  Productivity maybe 10x better. ‘98

-          Development of Diva LVS system for tsmc0.6um. Runs 20x faster than Dracula. 2x faster debug. Extracts data for statistical simulation. ‘00

-          Developed Dracula and Diva parasitics extraction and back-annotation capability with cross-probing for 6 processes ‘98

-          Motivation and Development of Bbsym_4.0 symbol library for direct migration into Analog Artist from existing designs. ‘99

-          Planned and managed migration from ViewLogic Workview onto Cadence to enable integrated design environment. ’97-‘99

-          Assistance in resolution of numerous Design Verification problems. Numerous late nights to expedite tape-out. ’94 – ‘01

-          Championed an integrated, correct-by-construction analog flow using Analog-Artist, Pcells, IC-Craftsman and Diva, ’98-‘01

-          Recruited Momchil Milev and mentored him on the BB EDA methodologies, quickly bringing him up to speed.  Matt recognized Momchil's talents and assigned him to the CBC10 EDA initiative, guiding the Correct-By-Construction development of Pcells (as presented in Matt's '99 EDA Strategic Plan). 

-          Recruited the technical expertise of Ron Dionne, whose contributions in the PC, web and workview environment have been vast. 

-          Provided guidance and mentorship to Rajesh Rajput, originally assigning him the OKI2um process, and bringing him quickly up to speed on the work in progress. 

-          Continues to provide a focal point of comprehensive design kit cross-testing and PDK documentation web pages.

-          Assigned Project Lead, EDA in Feb. ’98, with an emphasized mentoring role, continuously providing ideas and development analysis to his supervisor and colleagues.  Matt has presented or contributed to many internal classes given on EDA tools, and has presented various papers to the Design Council on possible future EDA paradigms developed in his research.

-          Presented the EDA design environment and methodologies to UofA staff/students in a voluntary seminar.

-          Participated in UofA ECE student activities, courses beyond the level of MSEE.  Assisted staff with Cadence setup for ECE dept.

 

PROMOTION: CAD ENGINEER 3 -> CAD ENGINEER, MEMBER, GROUP TECHNICAL STAFF


“I worked with Matt for 2.5 years in the same EDA group of Burr-Brown Corporation (Texas Instruments). Matt played a very important role in the EDA group. He was multi-tasking: developing PDKs, trouble-shooting technical problems, and providing training to designers. Matt is a very talented, hard-working and skilled EDA engineer. Also, he is a great team player. No doubt he can play a very important role in the success of any organization.” February 12, 2009

-          RAJESH RAJPUT, Senior PDK Developer, Texas Instruments

David Vaughn, Manager EDA, Tucson

 I first met Matt when I started at TI in the position of EDA manager and he was assigned to report to me.   Matt is one of the most creative and diligent EDA engineers I have ever worked with. He works long hours, documents his progress in uncommon detail and methodically approaches the solutions he is assigned to accomplish. While he reported to me, he discovered the need for a comprehensive system that I chose to think of as a characterization system, although it did much more. After receiving input from all the design managers, he developed the system in a very short period and those managers were very happy with his project.

I would highly recommend Matt for a position requiring creative use of CAD tools, especially in the Cadence environment. He works very well under pressure and is dedicated to further advancing users' productivity for integrated circuit design with vendor tools.

    -    David Vaughn, Ph.D. EDA Manager  (managed Matthew at Texas Instruments)

 

------------------------------------------------
Performance Review Excerpts

  • An excellent programmer (using either SKILL, perl, csh, Java or C/C++)
  • Regularly exceeds expectations
  • Exhibits a high professional standard of behavior and performance
  • Very customer oriented, works well with the layout and design engineers
  • Presents a positive attitude
  • Enthusiastically takes on large complex tasks
  • Results oriented, exceptional at planning and prioritizing his work, and anticipating problems
  • Works long hours to deliver the product in a timely manner - Kathy Poole, Mgr CDSI
  • Works late into the night to complete tasks  - Dewitt Ong, Ph.D
  • Has put in an incredible number of hours  -  CAE Mgr.
  • very high level of productivity - Paul Prazak, VP High Speed Design
  • Involved in all facets of support -  EDA Director.
  • Deep knowledge of all Cadence tools, process flows, and physical layout - EDA Director
  • Recognized as a leading expert within the design community and the go-to guy on a wide range of tools
  • Has recruited and mentored a number of EDA staff

 


ACADEMIC CURRICULUM VITAE - EXPANDED


http://www.matthew-scott.com/data/Diplomas.htm

 

Indiana University, Bloomington, Indiana.  Graduate courses towards the degree of Master of Computer Science

Course No/Name                 Hrs   Grd   Project Description
CSC690 Parallel Computation     3     A    Parallel C++, distributed Fast Fourier Transform algorithm
CSC622 VLSI Design              3     A    Designed Blackjack Dealer; layout in Magic; LVS/DRC
CSC690 Genetic Algorithms       3     A    Parallel C++ distributed Genetic Algorithm, Quadratic Assignment Problem
CSC665 Machine Learning         3     A    Neural net paper; Genetic Algorithm Minimal Deception Problem
CSC690 VLSI + Robotics          3     A    Designed Phase-Coupled Hexapod Controller; resulted in an IEEE book
CSC690 VLSI Verification        2     A    Verified Phase-Coupled Neural Hexapod Controller; LVS + DRC + Logic
Q550 Neural Computation         3     B    Computational analysis of supervised vs. Delta Rule neural network training
Q750 Cognitive Science          2     A    Colloquium; invited speakers on prominent topics in Cognitive Science
Total                          22    3.95 (GPA)


 

University of Arizona, Tucson, Arizona.  Graduate courses towards the degree of Master of Electrical Engineering

Course No/Name                 Hrs   Grd   Project Description
ECE672 CAD Alg for VLSI         3     A    Auto place-and-route algorithms (S 96)
ECE574A Comp Aided Des          3     B    Logic synthesis, Mealy/Moore ASMs (F 96)
ECE574B Comp Aided Des          -     +    Design timing and power analysis (S 96)
ECE578 Computer Networks        3     B    Paper on GA-NN routing optimization (F 97)
ECE695 Multimedia Systems       1     +    Multimedia development tools (F 97)
ECE678 Integrated Tele. Net     3     A    Advanced networks, multimedia, Java (S 98)
MIS581 Internet Commerce        3     A    Network tools and Internet commerce (S 98)
MIS580 Knowledge Mgmt           3     B    Expert systems in business modalities (F 98)
ECE579 Expert Systems           3     B    AI algorithms (A*, min/max, DS) (F 98)
ECE566 Knowledge Sys Eng        3     B    Advanced computer networks (S 99)
ECE677 Dist. Comp. Sys.         3     A    Distributed intelligent agents (F 99)
ECE910 Thesis                   3     A    Dist. Comp. & agent-based systems (F 99)
ECE910 Thesis                   3     A    VLSI digital cell generation (S 00)
Total                          34    3.44 (GPA)


 

Completed 22 graduate hours of work from Indiana University, plus 34 from the University of Arizona, or a total of 56.

  • MSEE Thesis: PDK QC, RCX, Error Propagation Matthew_Scotts_Thesis_Final.htm
  • Artificial Intel. I & II  C665  (Indiana '89) graph3.s  ga-ro.s
  • VLSI & Robotics (C622 Indiana '92) C622_VLSI_DESIGN.htm
  • Neural Computation  (Q550 - Indiana '93) Q550_NN.htm
  • Parallel Computation  (C690 - Indiana '91)  pc++ matrix
  • VLSI Verification  (C690c Indiana '92): Validation of Physical Layouts by DRC, LVS
  • Computer Architectures (Indiana '88) pC++/Sage++
  • Genetic Algorithms  (C690b - Indiana '94) Parallel Genetic Algorithms ga.s pga-2.c
  • Digital Design I & II (Indiana '89): Design of a PDP-8 Bit-Slice Microcomputer
  • Machine Learning (C665 Indiana '93) ttt.s
  • Computer Networks (ECE578 UofA '97) javanoginn – an agent-based routing system
  • Comp Aid. Logic Design (ECE574a – UofA '97): Multi-Dimensional Karnaugh Maps for Logic Synthesis
  • Expert Systems (ECE579 UofA):  UC Partial-Order-Planner
  • Artificial Intelligence (ECE566 UofA) Theory of Evidence for Circuit Analysis dempster.lsp  puzzle.lisp  
  • Integrated Telecommunications Networks (ECE678) Distributed Design Environment
  • Physical Design Automation VLSI (ECE672 UofA'95) :
    • Kernighan-Lin Fiduccia-Mattheyses: FM code
  • Knowledge Management (MIS580)
    • Coordination of Distributed Problem Solving in Multi-Agent Systems (here)
    • Intelligent Agents for Distributed Design Automation (here)


EDA/CAD TRADE COURSES - EXPANDED


http://www.matthew-scott.com/data/Certificates.htm

 

Affirma_Analog_Artist_2000

Diva_Interactive_Verification_1998

IC_Craftsman_1997

SKILL_Certificate_1996

Verilog_Simulation_Certificate

Virtuoso_AMS_Designers_2005

Dracula_Certificate_1994

Verilog_Certificate_2000

Verilog-A_Certificate_2007

 

 

      MENTOR CALIBRE LVS/DRC CLASS

            6 DAYS, ONLINE

 

·  Cadence Dracula Course San Jose, '94:  Implemented a dozen Dracula flows for Burr-Brown processes

·  Cadence 'Skill' Programming  '95 : Wrote numerous Skill Utilities,

·  Cadence IC-Craftsman Course. '97:  Purchased by Burr-Brown after evaluation and demo. Implemented and rolled-out

·  Cadence Diva Course, '98 : Converted all Dracula flows to Diva, added Diva 2.5D LPE

·  Cadence Analog Artist Course '00 :  Rolled out ADE usage to designers, initiated conversion to Spectre

·  Cadence Verilog-A/AMS tutorial '00. :  Created TI Internal Verilog-A re-use library.  Supported use.

·  Cadence Assura Verification Tutorial '00:  Converted TSMC flows from Diva to Assura, implemented several other flows

·  Cadence ROD Pcells Course '00

·  Cadence Verilog Course Austin, TX, 2001. 1 Wk

·  Cadence Assura Course, Dallas, 2001

·  Cadence AMS (Analog-Mixed-Sig) Course, April ‘05

·  Cadence Verilog-A Course, Jan ‘07


 

 


PROFESSIONAL PAPER: PDK & PVM VALIDATION


http://www.matthew-scott.com/prj/Matthew_Scotts_Thesis_Final.htm

http://www.matthew-scott.com/prj/ISQED2003A5FLAT.htm

 

Methods and Framework for QA of Process Design Kits

M. C. Scott       Texas Instruments, Design Automation Group, 6730 S. Tucson Blvd, Tucson, AZ 85716
M. O. Peralta     Texas Instruments, Device Modeling Group, 6730 S. Tucson Blvd, Tucson, AZ 85716
J. D. Carothers   University of Arizona, ECE Dept, P.O. Box 210104, Tucson, Arizona
P. Koch           Texas Instruments, Productization, 12500 TI Blvd., Dallas, Texas

 

Abstract

In this paper, we evaluate the dependencies between tools, data and environment in process design kits, and present a framework for systematically analyzing the quality of the design tools and libraries through the design flow.  The framework consists of a regression engine which executes sets of tests in a distributed computing environment.  These tests vary from simulations to validate models and simulators, to tests on layout versus schematics, parasitics extraction accuracy, and ultimately, tests to validate the extracted circuit integrity against the ideal.  In particular, it is shown that test-chaining is required to obtain confidence in the simulation-to-silicon equivalence.  A secondary objective is to identify and quantify the peak-error injection points.

1. Introduction

The process of electronic design depends critically on the quality of electronic design automation tools and the integrity of their underlying libraries and environments.   Conceivably, an error in any stage of the design process may propagate and expand further down the design flow, leading to a critical fault.  The existence and nature of such errors is hidden from the designer, who must therefore work on a basis of confidence in the tools and libraries.  Ideally, the designer need only consider the process-temperature-voltage corners and signal noise in defining the envelope of operation of a design.

In this age of increasing complexity, shrinking geometries, higher frequencies, lower power and shrinking market-entry windows, it is already prohibitive to obtain design closure on signal integrity issues, let alone contend with design kit tool and library errors. But these two concepts, design kit integrity and design complexity, go hand-in-hand.  Increasing demands on design performance and increasing susceptibility to signal integrity issues drive the design kit to increasing complexity. Likewise, design kit complexity exponentially increases a design's susceptibility to kit errors.

In this paper, we present a systematic means of qualifying the various stages and components of a design kit. Various cross-stage, cross-tool test-chains propagate confidence. We also seek to quantify the peak errors of various stages and devices, and to identify opportunities for accuracy and efficiency improvement in both the design kit development process and the design process itself. The paper is organized as follows.

      In section 2 we evaluate the dependencies between tools, data and environment in a process design kit. The concerns of the design process and its error injection points must be considered when developing simulation and extraction tools in order to neither under- nor over-design the tools. Eventually, this knowledge will improve corners definitions and Monte-Carlo (MC) simulations.

      In section 3, a framework for systematically validating the quality of the design tools and libraries through the design flow is presented. A tool "RegMan" (Regression Manager), is introduced which encapsulates the regression systems for validation of verification tools employing the distributed processing of jobs over LSF [3].

      In section 4, Front-end device models and simulators, the methods for cross-checking models, schematics and simulation are presented.  The RegMan tool is extended to run regression sets of the defined simulations and evaluations through calls to simulation scripts.

      Section 5 delves into the complexities of validating Physical Verification tools such as DRC and LVS.  Regression naturally fits this role, as there are numerous tests to perform and the evolutionary nature of the kits demands frequent re-runs of the test suites.

      Section 6 discusses the use of the RegMan tool to execute various evaluations of a large set of parasitic extractions of layout structures.  Comparisons of the resulting extracted netlists of a large suite of layout structures are made with respect to an industry-standard 3D interconnect simulator, Raphael [5], from Synopsys.

      Section 7 wraps up the Physical Verification validation with a system for comparing simulations of extracted netlists to the ideal schematic simulation.  The stage is motivated by the necessity to bind the layout to the original simulation - a more aggressive LVS.

      In conclusion, section 8 presents a review of the merits of this system and its results.

2. Design Flows, Kits and Complexity

 

These aforementioned complexities can roughly be partitioned through the design flow, vertically, into digital-centric and analog-centric. A major motivation of this project is addressing the various incarnations of signal integrity in many design types and attempting to evaluate, categorize and link these requirements in order to provide feasible solutions given the tools involved.

http://www.matthew-scott.com/prj/ISQED2003A5FLAT_files/image002.gif

Fig 1.  A 'simplified' design flow from the old days.

 

The design flow we evaluate principally consists of five stages: spice simulation, layout, layout verification, extraction, and back-annotated re-simulation.  There are innumerable ways this flow may be constructed and extended.  Figure 1 represents one implementation. There are multiple spice simulators, multiple layout tools, multiple DRC, LVS, and LPE tools. The device and cell libraries must work in all of them.  Thus, the design kit development process is a bit involved.  Many concerns must be taken into consideration and made to fit with the total kit objective. This leads to multiple re-works and gradual improvement.  Given the interdependencies between libraries and tools, changes in any part may create faults propagated to others.  A testing and sign-off quagmire results.  The design kit complexity is directly driven by the design complexity and its need for multiple tools and flows, and their interdependencies.  Thus, the design kit development must be thoroughly researched and pre-planned based on design requirements, process technology capabilities, the available tool suite, and planned design flows.

2.1 Digital Complexity

In the digital realm, design complexities are being dominated by interconnect parasitics and their signal-integrity degradation effects, which introduce non-linearities into the numerical optimization solutions provided by synthesis, placement and routing tools. The increasing dominance of variable interconnect effects over the static cell delays (gate delays) effectively diverges the synthesized timing estimate from the eventual physical result.  Numerical methods therefore become less viable and the designer is forced into a pseudo-convergent state of iterations.  The standard solution is back-end layout parasitics extraction (LPE) and transistor-level simulation.  The parasitics may either be directly simulated in spice, or back-annotated into the Verilog RTL simulation. This project provides a means for evaluating and validating LPE tools so as to provide the accuracy to enable better cell library characterization, better interconnect back-annotation, and thus smoother progress to convergence.

2.2 Analog Complexity

In the analog realm, the push towards higher frequencies and finer signal resolutions has demanded similar improvements in simulation and signal quality, such as signal matching. But, analog has a much more diverse bag of tricks than digital, which must be meticulously extracted to enable the analog formulae to work.  Usually analog is not limited by the size complexity of digital, but rather the specification of constraints and the ability to optimize on those constraints in silicon.  Critical to analog design is the accuracy of models and simulation, and the accounting for parasitic effects on sensitive lines in the layout.  Knowledge of the accuracies of models and knowledge of the accuracy of parasitics extraction can help immensely in determining the degree of simulation coverage necessary to ensure performance.

2.3 Design Flow Error Propagation

The presented expansion in complexity, speed, size, range and SI sensitivity of circuits is a problem every engineer is aware of.  The increases in all of these areas are plagued with inter-dependencies which are connected through the signal integrity realm [2]. Each engineer is also acutely aware that there is a vast matrix of economic trade-offs in optimization of various parts, and likewise that there exists a chain of error introductions and variances within which they must exercise their design to ensure an envelope of operation during its expected lifetime. The typical engineer is not, however, empowered with the information or tools to globally account for all of these factors in their design planning and analysis.  They can only set certain parametric goals and then hope that through use of various point tools and many iterations through the design flow they might converge to an acceptable solution, with reasonable hope that all likely faults have been found.  This work is also done with the assumption that the underlying design environment is correct and not changing, thereby allowing for a controlled-experiment environment.  Importantly, the design engineer needs to have knowledge of the error levels of different tools, libraries, devices and methods.  A means is necessary for the management and satisfaction of design constraints across tools and flows.  This must be provided by rigorous analysis of each stage, and development of an inter-tool constraint management utility.  The system we present is integral to the development of such a managed system. 

3. A Regression Manager

RegMan (Regression Manager) is a graphical tool-control framework designed to facilitate the distributed-processing (LSF) automation of simulation, physical verification and parasitic extraction runs.  It is generalized to work on multiple design kits and with various sets of physical verification rules by use of wrappers and interfaces, as in [1].  The RegMan flow is in Figure 2.

http://www.matthew-scott.com/prj/ISQED2003A5FLAT_files/image004.gif

Figure. 2  RegMan Design Kit Validation Flow

It takes as input a comma-separated-value (CSV) file describing test cells and their mode of pass/fail expectation.  Outputs for physical verification validation (DRC, LVS) include various reports on OK or NOK (not-ok) tests, matches, mismatches and failed runs.  For the parasitics extraction, reports on accuracy of the extraction tool versus an industry standard, Synopsys' Raphael, are generated.  On the simulation regression tests, results are compared to tables of pre-calculated expectations and summary reports are generated (Figure 6).
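As a sketch, the pass/fail bookkeeping this implies can be captured in a few lines of Python. The column names (cell, status, outcome, expected) are illustrative assumptions, not RegMan's actual CSV schema, which the paper does not specify:

```python
import csv

def load_testcells(path):
    """Parse a RegMan-style test list: one row per cell, with the outcome
    the test is expected to produce.  Column names are hypothetical."""
    with open(path, newline="") as fh:
        return list(csv.DictReader(fh))

def classify(results):
    """Bin finished runs into OK (matched expectation), NOK (mismatched),
    and FAILED (run never completed)."""
    summary = {"OK": [], "NOK": [], "FAILED": []}
    for r in results:
        if r["status"] != "done":            # run crashed or never finished
            summary["FAILED"].append(r["cell"])
        elif r["outcome"] == r["expected"]:  # matched its expectation
            summary["OK"].append(r["cell"])
        else:                                # mismatch => needs attention
            summary["NOK"].append(r["cell"])
    return summary
```

Note that a cell *expected* to fail (e.g. a deliberate DRC violation) that passes is reported NOK, which is exactly the inversion that manual spot-checking tends to miss.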

http://www.matthew-scott.com/prj/ISQED2003A5FLAT_files/image005.gif

Figure. 3  RegMan Interface

The tool is written in Perl/Tk and consists of 11K+ lines of code.  The interface (Figure 3) enables the user to configure the environment, choose processes, choose various stages (LVS, DRC, LPE, SIM etc.) to run on the cell list, choose subsets of cells to run, view the setup files for any cell, and view log files and results for any cell.  The list of basic options and actions available exceeds 160 and is beyond the scope of this report.  Among RegMan's side-advantages is the creation of a snapshot of the environment for each cell run. This snapshot enables detection of changes in the environment which may render previous runs indeterminate, letting users detect whether the design kit, schematic, layout, models or verification rules have changed.
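The snapshot mechanism can be sketched by content-hashing the files that make up the environment; the file names and function names below are illustrative, not RegMan's implementation:

```python
import hashlib

def snapshot(paths):
    """Record a content hash for each environment file
    (rule decks, model files, cell views)."""
    snap = {}
    for p in paths:
        with open(p, "rb") as fh:
            snap[p] = hashlib.sha256(fh.read()).hexdigest()
    return snap

def changed(old, new):
    """Files whose content differs between two snapshots; a non-empty
    result renders earlier regression runs indeterminate."""
    return sorted(p for p in old if new.get(p) != old[p])
```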

4. Front-End Device Models and Simulators

The accuracy of all tools and libraries is never any better than the accuracy of the device models.  It has been found that rigorous measurement and characterization of the test chip devices, and cursory simulation tests, are insufficient to guarantee accurate simulation.   Tests in this project have revealed errors in the model parameters, in the netlists generated by the symbol and Pcell libraries, and in the simulators themselves.  This section delves into the process of validating the models and simulators, and their consistency in more advanced techniques such as Monte-Carlo and corners simulation.  These tests are provided by the modeling side, and are automated by RegMan through parameterized calls to a generalized Ocean script [4]. The Ocean script has built-in evaluators for each type of test, and thus is highly extensible to real design analysis.

The device models/simulation tests consist of the following four classes.

Measurement vs. Simulation

      Given a database of measured parameters, such as the models themselves, simulations are run on isolated devices and the results compared to measured values. The test fixtures include: DC I-V checks, capacitance over frequency, MOSFET capacitance checks, MOSFET transconductance checks, 1/f noise checks.  Each test is repeated in each stage. Some tests are elaborated below. Figure 4 depicts the BJT I-V and MOSFET Cbg checks.

Schematic vs. Simulation

This type of quality assessment mostly applies to resistors and capacitors. For resistors, a DC voltage source is applied across the resistor and the resulting simulated DC current is recorded. For capacitors, an AC voltage source of 1 mV (rms), denoted Vac, is applied across the capacitor and the resulting simulated AC current, denoted Iac, is recorded and used to calculate the capacitance with the equation C = (Iac / Vac) / (2πf).  In the current-voltage checks for diodes, BJTs, and MOSFETs we merely bias the devices at typical design points and simulate the currents through the diode, the collector, or the drain terminals (or other terminals as desired). These values are then recorded and self-consistency checks are performed, such as Simulator vs. Simulator, Measured vs. Simulated, etc.
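The capacitance back-calculation is easy to verify numerically. This sketch round-trips an illustrative 1 pF device at 1 MHz (values are made up for demonstration, not measured data):

```python
import math

def capacitance_from_ac(i_ac, v_ac, freq):
    """Back out C from one AC sweep point: C = (Iac / Vac) / (2*pi*f)."""
    return (i_ac / v_ac) / (2 * math.pi * freq)

# A 1 pF capacitor driven by 1 mV rms at 1 MHz draws Iac = Vac * 2*pi*f * C,
# so the extraction should recover C exactly.
f, v_ac, c_true = 1e6, 1e-3, 1e-12
i_ac = v_ac * 2 * math.pi * f * c_true
assert abs(capacitance_from_ac(i_ac, v_ac, f) - c_true) < 1e-18
```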

SimulatorX vs. SimulatorY

      In Figure 5, the results of a SimulatorX-vs.-SimulatorY run of equivalent models are depicted. Some differences were expected; the others were quickly identified and remedied.  The Sim-vs.-Sim check also uses an Ocean script to compare signals.  A simple rms(v(s1) – v(s2)) is sufficient to detect differences in behavior.
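On sampled waveforms, this check reduces to a few lines; the sampled-vector representation and the 1e-5 reporting threshold (the cutoff used for the flat-line check in section 7) are assumptions for the sketch:

```python
import math

def rms_diff(sig1, sig2):
    """RMS of the pointwise difference between two sampled waveforms,
    mirroring the Ocean-script check rms(v(s1) - v(s2))."""
    assert len(sig1) == len(sig2), "waveforms must share a time base"
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(sig1, sig2)) / len(sig1))

def simulators_agree(sig1, sig2, tol=1e-5):
    """Flag a node only when the residual exceeds the reporting threshold."""
    return rms_diff(sig1, sig2) <= tol
```

In practice the two simulators' outputs would first be resampled onto a common time grid, since their adaptive time steps rarely coincide.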

       
 

http://www.matthew-scott.com/prj/ISQED2003A5FLAT_files/image008.jpg

 

http://www.matthew-scott.com/prj/ISQED2003A5FLAT_files/image009.jpg

     


Corners min, max vs. nominal, Monte-Carlo

      Corners are checked against the nominal to make sure the low corner is less in value than the nominal and the high corner is greater in value than the nominal. Also, low and high corners can be checked against the μ ± 3σ values calculated from Monte-Carlo simulation runs. This can be done for DC current-voltage, AC capacitance-voltage, or any other type of device output characteristic including transient responses.
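Those two consistency checks can be sketched as follows; the sample values are illustrative, and the bracketing convention (corners should enclose the ±3σ Monte-Carlo spread) is the interpretation stated above:

```python
import statistics

def check_corner(low, nominal, high):
    """Low corner must fall below nominal, and high above it."""
    return low < nominal < high

def check_against_mc(low, high, mc_samples):
    """Corners should bracket the mu +/- 3*sigma spread estimated
    from Monte-Carlo simulation runs."""
    mu = statistics.mean(mc_samples)
    sigma = statistics.stdev(mc_samples)   # sample standard deviation
    return low <= mu - 3 * sigma and high >= mu + 3 * sigma
```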

Figure 4.  BJT I-V check (left); MOSFET Cbg check (right).

http://www.matthew-scott.com/prj/ISQED2003A5FLAT_files/image011.gif

Figure 5.  Two simulators' results on equivalent circuits identified small differences in rise/fall times of MOSFETs, and significant differences in some passive device simulations.

5. Physical Verification Tools Validation

Physical verification (PV) includes the Design Rules Checks (DRC), Layout vs. Schematic (LVS), and the follow-on Layout Parasitics Extraction (LPE).  The PV process is highly prone to error due to the creative process of transferring a design from schematic to layout. 

The DRC rules are generated from a long list of historical process control tests and the post-fabrication results in terms of yield impact and reliability.  A typical DRC rule deck may contain hundreds of rules, implemented in between 4000 and 6000 lines of code.  The rules are made to be as general as possible to avoid false errors and allow for layout creativity where warranted.  Given that most rules apply to many permutations of geometry orientations and relations, the number of checks needed to test just the minimum and maximum boundary conditions of the rules is intractable.  In this respect, it becomes imperative to automate the process of rule-deck validation and accelerate it through distributed processing.  One can get by with the traditional pass/fail quilt, but the nature of layer creation through Boolean operations within the rule decks leaves too much room for error whenever a rule is modified or added.  DRC regression tests have been automated for years now and have proven successful in QA’ing rule decks.

LVS rules are developed to ensure equivalence of the netlists generated from the schematics and layouts.  This includes graph-matching of the netlists from the layout and schematic, the device types and the device sizes.  Here there is quite a bit of leeway given to the actual construction of devices.  For example, a MOSFET with W=x and L=y may be generated single-fingered, multi-fingered, inter-digitated with another device and surrounded by dummy poly.  There are quite a few esoteric practices allowed in the translation from schematics to layout, including use of parasitic devices in the schematics, multiple-potential substrate regions for analog and digital sections, and smashing of parallel devices in either the schematic or layout. The list of conceptual tests already created exceeds 200. The number of permutations of device constructions is in the hundreds. The list of token/string pairs by which RegMan evaluates the reports is about 100.  Frequent changes to design kit contents are expected during new technology development. This tool can help ensure that these changes are represented and implemented consistently across the design kit.

6. Parasitics Extraction Tool Validation

To reduce the errors introduced by the physical design, Layout Parasitics Extraction (LPE) is employed to extract and back-annotate the apparent final physical factors back into the design simulation, whereupon the design is re-simulated to determine if specs are still met. The effects induced by parasitics are central to the analysis of 'Signal Integrity', or SI.  As SI is a facet of EDA coupled to all stages of design, it presents the added complexity of requiring high integration between tools, and means of information sharing between various abstraction levels, to allow for convergence and global optimality.  Thus, the solution provided for parasitics extraction and signal integrity must simultaneously be:

  • Accurate: The analysis must guarantee that timing and SI simulations represent the real product
  • Robust: The solution must be able to accommodate varying needs, tools
  • Feasible: It must be simulatable within reasonable time and compute resources.
  • Usable: The tools and methodologies must be integrated and accessible to the infrequent user.

The RegMan system facilitates the analysis of extraction tools to determine their accuracy.  Likewise, it validates the 'rule-set' provided to the tool which defines the process technology and layers to be extracted.  This EDA-generated data is prone to error (notably, the author’s own).  Only through rigorous testing can we be certain that no error slipped through.  To validate an extraction tool, comparisons are made against the 3D modeling tool Raphael. The process consists of:

  • Define the techfile for Raphael and LPE tool
  • Raphael regresses over 11 primary topologies
    • Includes 35 permutations each on layers
    • Each layer permutes Width, Spacing, L
  • Generate equivalent layout structures with Skill
  • RegMan runs LVS and LPE over all structures
  • RegMan parses the Raphael capacitance database
  • RegMan parses the LPE extracted spice files
  • RegMan compares and analyzes the results
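The final comparison step above can be sketched as follows. Structure names, capacitance values and the fF unit are illustrative; the real flow parses Raphael's capacitance database and the LPE spice files to populate these dicts:

```python
def compare_extraction(raphael_caps, lpe_caps):
    """Per-structure signed relative error (%) of the LPE tool against
    Raphael reference capacitances (dicts: structure name -> C in fF)."""
    report = {}
    for name, ref in raphael_caps.items():
        ext = lpe_caps.get(name)
        if ext is None:
            report[name] = None                       # missing from extraction
        else:
            report[name] = 100.0 * (ext - ref) / ref  # signed % error
    return report

def mean_abs_error(report):
    """Average magnitude of error over all structures that extracted."""
    errs = [abs(e) for e in report.values() if e is not None]
    return sum(errs) / len(errs)
```

Averaging per primary layer, layer pair, and W/S/L class, as the report described next does, is then a matter of grouping these per-structure errors by name.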

The final report (Appendix A) consists of the errors for each structure, averages for each primary layer and layer-pair, and each class of W, S, L permutations.   Said report facilitates identification of possible aberrations, and provides for determination of standard deviation and variance.

Also implemented in this system is the ability to compare against Chern's and Sakurai's equations generated by Raphael [5], or measured data.  Of course, measured is preferred, but not always available.

Another tactic for identifying errors is to run the regression over all the cells using all three corners of the process technology.  These corners (min, nom, max) are generated automatically from the spreadsheet process parameters.  The min corner is defined as: max W, min S, min ILD.  The max corner is the opposite.
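A minimal sketch of that corner generation, assuming a hypothetical parameter layout of (nominal, tolerance) pairs pulled from the spreadsheet. Per the text, the min corner takes max W, min S and min ILD, and the max corner the opposite:

```python
def make_corners(params):
    """Build (min, nom, max) corner sets from {"W": (nom, tol), ...}.
    The dict layout is illustrative, not the actual spreadsheet format."""
    def lo(p): return params[p][0] - params[p][1]   # nominal - tolerance
    def hi(p): return params[p][0] + params[p][1]   # nominal + tolerance
    # min corner: max W, min S, min ILD (as defined in the text)
    min_corner = {"W": hi("W"), "S": lo("S"), "ILD": lo("ILD")}
    # max corner is the opposite combination
    max_corner = {"W": lo("W"), "S": hi("S"), "ILD": hi("ILD")}
    nominal = {p: v for p, (v, _tol) in params.items()}
    return min_corner, nominal, max_corner
```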

7. Validation of Extraction Circuits

The LPE-generated netlist needs to be validated against the original (or ideal) schematic netlist to guarantee the equivalence of the laid-out intentional devices to their originators in the schematic.  A schematic is created which contains instances of each of the model-symbol combinations, each being driven independently by some appropriate stimuli. A layout is then created equivalent to the schematic and extracted with the LPE tool.  The extracted layout is pared down to include only the intentional devices represented by the schematic. This is done by forcing the extractor to ignore device-internal parasitics such as MOSFET AD, AS, PD, PS, and by stripping out the interconnect parasitics.  The original schematic and layout are then simulated with the same stimuli driving each device.  The resultant signals are compared through a simple difference v(s1) - v(s2).

http://www.matthew-scott.com/prj/ISQED2003A5FLAT_files/image013.gif

Figure 6.  Extracted netlist vs. Ideal Schematic.

 Since the signals should be identical, we should see only flat lines in the waveform tool.  Any fluctuation represents some difference in the device netlists between the original and the extracted.  For each device pair the script computes v = rms(v(s1) - v(s2)) and reports any signal with rms > 0.00001 in a file and a waveform, so that the RegMan tool can automate and evaluate the process.  In Figure 6 the left side represents MOSFETs; the right side indicates bipolar devices.  The layout side here still includes parasitics, hence the significant differences.
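The pass/fail check the script applies can be sketched as follows (Python; the sampled waveforms are hypothetical, and only the rms > 0.00001 threshold comes from the text):

```python
import math

THRESHOLD = 1e-5  # rms tolerance used by the flow, per the text

def rms_diff(s1, s2):
    """RMS of the pointwise difference v(s1) - v(s2) between two waveforms."""
    diffs = [a - b for a, b in zip(s1, s2)]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Hypothetical sampled node voltages from the two simulations
ideal     = [0.0, 0.5, 1.0, 0.5, 0.0]
extracted = [0.0, 0.5, 1.0, 0.5, 0.0]   # identical -> flat line, passes
shifted   = [0.0, 0.6, 1.0, 0.5, 0.0]   # device mismatch -> flagged

print(rms_diff(ideal, extracted) <= THRESHOLD)  # True: equivalent
print(rms_diff(ideal, shifted) <= THRESHOLD)    # False: reported
```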

8. Conclusion and Analysis

      The discussed systems and regression manager framework have already proven valuable in the development and validation of design kits.  Many heretofore undetected errors have been discovered and remedied.  Accordingly, the tool has accelerated the development process by making comprehensive tests feasible with each step of tool or library modification.  This in turn allows more aggressive kit development, implementing more features with less risk of introducing errors.  Table 2 provides our subjective estimates of the benefits of this system.  Here, FTE Time is the Full-Time-Equivalent staff time for manual testing, and RegMan Time is the same staff commitment with RegMan employed.

Table 2.  Estimated QA time: Manual vs. RegMan.

Test Stage   FTE Time   RegMan Time   Estimated # Tests   Repeats Estimate
Mod / Sim    6 wks      30 min        1000+               12+
DRC          4 wks      12 hrs        1000+               20+
LVS          2 wks      2 hrs         200+                20+
LPE          4 mo.      24 hrs        4000+               5+
LPE / Sim    1 day      1 hr          20+                 5+

      Finally, a key factor in this system is the chaining, or overlap, of tests between tools and libraries.  This propagation of 'confidence' also enables visualization of error propagation and determination of peak-error injection points.  Knowledge of peak and average errors improves the design of tests for corners, Monte Carlo and sensitivity analysis.

9. References

[1]  P. Chen, D. A. Kirkpatrick, and K. Keutzer, "Scripting for EDA Tools: A Case Study", ISQED, IEEE, 2001, pp. 87 - 93.

[2]  A. H. Farrahi, D. J. Hathaway, M. Wang and M. Sarrafzadeh, "Quality of EDA CAD Tools: Definitions, Metrics and Directions", ISQED, IEEE, 2001, pp. 395-405.

[3] Load Sharing Facility Users Guide, Platform Computing, http://www.platform.com//

[4] Ocean Users Guide, Cadence Design Systems, San Jose CA.

[5] Raphael Interconnect Analysis Program,  Reference Manual, Synopsys, July 2000

Appendix A: LPE vs. Raphael Results, Abbreviated

      The following table, A.1, presents the summary output of LPE values vs. Raphael analysis.  These are the results from just one 'general structure', ARRAY_ABOVE_GP, which consists of three strips in an array above a grounded, effectively infinite bottom plate (Figure A.1).  Given 6 conducting layers, 2 lengths (100u, 1000u), 3 widths and 3 spacings, there are 21*2*3*3 = 378 permutations of one layer type above another.  Not shown are the individual results from each permutation.  The system implements 11 general structures.  Given these varied permutations, we should be able to isolate which unique 'topologies' incur the most error.
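The permutation count can be checked by enumeration (Python; the layer names are illustrative, and the reading that the 21 layer configurations are 6 single layers over the ground plane plus the 15 distinct layer pairs is an assumption, not stated in the text):

```python
from itertools import combinations

layers = ["M1", "M2", "M3", "M4", "M5", "M6"]  # 6 conducting layers (names illustrative)
lengths = [100, 1000]   # strip lengths (um), per the text
widths = [1, 2, 3]      # 3 width classes (values illustrative)
spacings = [1, 2, 3]    # 3 spacing classes (values illustrative)

# Assumed 21 layer configurations: each layer alone above the ground
# plane (6), plus each pair of distinct layers, one above the other (15).
layer_pairs = [(top, "GP") for top in layers] + list(combinations(layers, 2))

cells = [(pair, l, w, s)
         for pair in layer_pairs
         for l in lengths for w in widths for s in spacings]
print(len(cells))  # 21 * 2 * 3 * 3 = 378
```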

 

 

Figure A.1.  ARRAY_ABOVE_GP: three parallel strips (width W, spacing S, length L) in an array above a grounded bottom plate (GND).

Cb: C of center strip to ground plane

Cc: coupling of center strip to neighbor

Ctot = Cb + 2Cc

       
                       

 


HEXAPOD ROBOT CONTROLLER BASED ON NEURAL PHASE-COUPLED LOOPS


Stiquito: A Platform for Artificial Intelligence

by Matthew C. Scott


ABSTRACT

This paper presents examples and opportunities for studying and applying the paradigms of Artificial Intelligence using the simple and inexpensive platform provided by the Stiquito robot.

  • The mechanics of articulated hexapod ambulation are inspected and a genetic algorithm is motivated for determining the optimal gaits that Stiquito may attain while minimizing energy consumption and maintaining balance.
  • An algorithm is presented for controlling the legs which allows for over C(240, 3) forward-propelling gaits. Its development on a CAD system and implementation on an FPGA are illustrated.
  • The algorithm is then presented as a neural network using chains of phase locked loops. Implementation using CMOS  VLSI technology is then investigated.
  • In summary, some possible experiments are considered for producing interesting population dynamics such as emergent cooperation and self-organization.

The above ABSTRACT is quoted from the text of the chapter. The following pages are provided as supplemental information for the interested reader and daring experimentalist. As the crux of the paper is the demonstration of how chains of neural-network-based phase-locked loops can generate multiple stable gaits, not much work has been devoted to the genetic algorithm analysis or an actual VLSI neural network. But these are still the author's ultimate goal, and he will be thrilled to hear your inputs!



 

 

 

 

http://www.matthew-scott.com/prj/ch13/figs/fig3.gif

http://www.matthew-scott.com/prj/ch13/figs/fig4.gif

http://www.matthew-scott.com/prj/ch13/figs/fig6.gif

http://www.matthew-scott.com/prj/ch13/nn/nn9.jpg

A fully connected array.


GENERAL ELECTRIC FACTORY AUTOMATION


 

Project: Mobile Data Terminal Protocol Conversion System and Assembly Line Monitoring System

 

These systems required the interfacing of a PDP-1170, MicroVax II, black-box code operated switches, TLSI-4 multiplexors, com adaptors, line drivers, a RAMM-16 multiplexor, radio linked mobile data terminals, motion detectors and numerous laser scanners placed strategically along the assembly line.  Schematics and logical topology of the system follow:

 

ALMS_CONFIGURATION

 

Assembly_Line_Monitoring_Sys

 


GRADUATE VLSI DESIGN WORK EXAMPLE


http://www.matthew-scott.com/data/C622_VLSI_DESIGN.htm

 C622 VLSI DESIGN: BLACKJACK DEALER

Matt Scott & Vijay Baliga 

 

http://www.matthew-scott.com/data/C622_VLSI_DESIGN_files/image004.jpghttp://www.matthew-scott.com/data/C622_VLSI_DESIGN_files/image005.jpg

 


JLE: JAVA LAYOUT EDITOR (PERSONAL PROJECT)


 

 http://www.matthew-scott.com/prj/jle/JLE_Project.htm

 

JLE: JAVA LAYOUT EDITOR

Place and Route Layout Editor for Integrated Circuits

 

http://www.matthew-scott.com/prj/jle/JLE_Project_files/image002.jpg

Virtuoso* Layout Editor from Cadence

http://www.matthew-scott.com/prj/jle/JLE_Project_files/image004.gif

JLE, Version 0.1

 

II. UML Model for Layout Program

http://www.matthew-scott.com/prj/jle/JLE_Project_files/image006.jpg

 IV. STATE DIAGRAM for Layout Program

 http://www.matthew-scott.com/prj/jle/JLE_Project_files/image008.jpg

 


SCHEME-BASED GENETIC ALGORITHM FOR LOTKA-VOLTERRA EXPERIMENTS


 http://www.matthew-scott.com/prj/scheme/ga.htm

 

A Genetic-Algorithm system for evolving emergent behaviors in Turing machines.

 

A Foxes & Rabbits Simulation

 

      This Scheme-based program can be used to discover the dynamics of colonies of autonomous robots (Turing machines) in a closed but complex environment. The machines start out with random behaviour and quickly evolve (via cross-over genetic reproduction) to optimize their energy function (fitness) in the environment.

     This implementation also demonstrates a Lotka-Volterra 'predator-prey' population dynamic.  (From Wikipedia:)

 

\frac{dx}{dt} = x(\alpha - \beta y)

\frac{dy}{dt} = -y(\gamma - \delta x)

where

y is the number of some predator (for example, wolves);

x is the number of its prey (for example, rabbits);

dy/dt and dx/dt represent the growth rates of the two populations over time;

t represents the time; and

α, β, γ and δ are parameters representing the interaction of the two species.
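The oscillatory dynamic these equations describe can be reproduced with a simple forward-Euler integration (Python; the parameter values and initial populations are arbitrary illustrations, not taken from the simulation):

```python
# Forward-Euler integration of the Lotka-Volterra equations:
#   dx/dt =  x(alpha - beta*y)    (prey)
#   dy/dt = -y(gamma - delta*x)   (predator)
alpha, beta, gamma, delta = 1.0, 0.1, 1.5, 0.075  # illustrative values
x, y = 10.0, 5.0        # initial prey / predator counts
dt, steps = 0.001, 50_000

peaks = 0
prev_dx = 0.0
for _ in range(steps):
    dx = x * (alpha - beta * y) * dt
    dy = -y * (gamma - delta * x) * dt
    if prev_dx > 0 >= dx:   # prey population just peaked
        peaks += 1
    prev_dx = dx
    x, y = x + dx, y + dy

print(f"prey peaks observed: {peaks}")  # repeated peaks = sustained oscillation
```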

http://www.matthew-scott.com/prj/scheme/start.gif

Initial random placement of machines, plants, food.

http://www.matthew-scott.com/prj/scheme/end.gif

Half an hour later. The colonies are all migrating in groups.

 


PARALLEL-C++ GENETIC ALGORITHM FOR QUADRATIC GRAPH MATCHING


http://www.matthew-scott.com/prj/ga/final.html

 

A Parallel Genetic Algorithm for solving the Quadratic Graph-Matching Problem

Review of the Quadratic Assignment Problem

The Quadratic Assignment Problem of order n is the search for a mapping of n activities to n locations. It was first described by Koopmans and Beckman (1957). Originally it was devised to solve the Job Shop Scheduling problem, in which items at each station were transferred periodically to other stations, and the transfer had varying costs according to the item type. Thus, given a limited number of stations, assign the items to them in order to minimize the total average cost of transfers.

D = [d_{ih}] = matrix of distances (location i to location h)

F = [f_{jk}] = matrix of flows (item j to item k)

C = [c_{ij}] = matrix of assignment costs (item i to location j)

Now, simply find a permutation \phi : i -> \phi(i), i.e. a particular assignment of item j = \phi(i) to station i (i = 1, ..., n), that minimizes the cost of the flow (D*F)*C. As the cost function C is easily applied, it is not presented in the equations.

min z = \sum_{i,h=1}^{n} d_{ih} f_{\phi(i)\phi(h)}

This is reformulated as a quadratic function by introducing a permutation matrix X of dimension n×n, such that each x_{ij} = 1 if item j is assigned to station i and 0 otherwise. Thus:

min z = \sum_{i,j=1}^{n} \sum_{h,k=1}^{n} d_{ih} f_{jk} x_{ij} x_{hk}

where the x_{ij} are such that

\sum_{i=1}^{n} x_{ij} <= 1   (j = 1, ..., n) : no more than 1 item per station

\sum_{j=1}^{n} x_{ij} = 1   (i = 1, ..., n) : every item is assigned

each x_{ij} an element of {0, 1}
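As a concrete check of this cost function, brute-force evaluation over all permutations is feasible for tiny n (Python; the D and F values are illustrative, the cost matrix C is omitted as in the text, and the GA replaces the exhaustive search for realistic n):

```python
from itertools import permutations

# Tiny QAP instance (n = 3, values illustrative):
# D[i][h] = distance between stations i and h
# F[j][k] = flow between items j and k
D = [[0, 1, 2],
     [1, 0, 1],
     [2, 1, 0]]
F = [[0, 5, 1],
     [5, 0, 2],
     [1, 2, 0]]

def cost(phi):
    """z = sum over i,h of d_ih * f_{phi(i)phi(h)},
    where phi assigns item phi[i] to station i."""
    n = len(phi)
    return sum(D[i][h] * F[phi[i]][phi[h]]
               for i in range(n) for h in range(n))

# Exhaustive search: place the high-flow item pair at nearby stations.
best = min(permutations(range(3)), key=cost)
print(best, cost(best))
```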

Figure 3. QAP, D,F,C example

http://www.matthew-scott.com/prj/ga/final_files/image001.gifhttp://www.matthew-scott.com/prj/ga/final_files/image002.gif

RESULTS (EXAMPLE RUN)

http://www.matthew-scott.com/prj/ga/tbest.gif

 

 


TEAMWORK!  USS IOWA BB-61 WATCH SUPERVISOR. CAPTAIN OF FOOTBALL TEAM etc.


http://www.matthew-scott.com/data/High_School.htm

Football_Captain_1981

Captain of Lost River Raiders Football Team: ‘Nearly’ Undefeated

 

 

 

 

 

 

 

Matt_Soccer_Costa_Rica

Goodwill Soccer Matches:  Guatemala, Colombia, Venezuela, Costa Rica, Honduras, Panama, Denmark, Germany, France, Norway …


 


MATTHEW’S FAMILY-FOLK & TRAVELS



 

ONLINE RESUME
COVER LETTER ONLINE
RECOMMENDATIONS
PERFORMANCE REVIEWS
ACADEMIC CURRICULUM VITAE - EXPANDED
EDA/CAD TRADE COURSES - EXPANDED
PROFESSIONAL PAPER: PDK & PVM VALIDATION
HEXAPOD ROBOT CONTROLLER BASED ON NEURAL PHASE-COUPLED LOOPS
GENERAL ELECTRIC FACTORY AUTOMATION
GRADUATE VLSI DESIGN WORK EXAMPLE
JLE: JAVA LAYOUT EDITOR (PERSONAL PROJECT)
SCHEME-BASED GENETIC ALGORITHM FOR LOTKA-VOLTERRA EXPERIMENTS
PARALLEL-C++ GENETIC ALGORITHM FOR QUADRATIC GRAPH MATCHING

 

Last Updated ( Thursday, 17 November 2011 05:39 )