Testing & Validation Strategy¶
Note
Critical Reference: Doc 45 Parity Test Plan. This document (doc 06) defines the testing infrastructure and methodology — HOW we test. The 45-tb11-parity-test-plan defines the acceptance gate — WHAT must pass and to what threshold. Doc 45 specifies 92 tests across 7 parity gates (G1–G7) with a ≥95% pass rate on 76 testable items required to close M2. All P0 step results feed into the doc 45 scorecard; doc 06 validation infrastructure is the execution engine for doc 45 test phases.
Last Updated: April 15, 2026 — P0 incremental baseline at 40/56 (71%). G5 DXF parity FAIL: 97/350 entities (28%). Prior “447/447 PASS” was invalid (golden exported from mangled drawing). Annotation/dimensioning pipeline is the primary gap. Steps 11–12 and graduation not started. P-series sub-dialog capture plan documented but not yet started.
Note: This plan is a working draft. Some technical details (exact file provenance, build parity, vendor timelines, pricing, and Autodesk product capabilities) may need verification as we inventory the full legacy tree and validate behavior in modern environments.
Date Pair Convention¶
All validation records must document two distinct dates per test run: the date of the first pass and the date of the review/confirmation. This dual-column structure enables regression tracking and prevents ambiguity about when evidence was generated versus when it was validated.
Format and Examples¶
| First Pass | Reviewed | Status | Notes |
|---|---|---|---|
| 2026-03-13 | 2026-03-15 | ✅ PASS | OCR verified at 96% character match vs golden (VM 102) |
| 2026-03-14 | PENDING | 🔄 REGRESSION | Previously passed on 2026-03-13; failure reason documented in bug #93 |
| 2026-03-15 | 2026-03-16 | ✅ PASS | Re-validated after bug fix; OCR at 97% match |
Retroactive Interpretation¶
Legacy entries with a single date (e.g., “PASS (2026-03-10)”) are interpreted as: First Pass = 2026-03-10, Reviewed = same date. This maintains backward compatibility while enabling the date-pairing structure going forward.
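The retroactive rule can be sketched as a small normalizer. This is illustrative only — `normalize_record` is a hypothetical helper, not part of the repo's tooling — but it shows the intended mapping from a legacy single-date entry to the dual-date structure:

```python
import re

def normalize_record(entry: str) -> tuple[str, str]:
    """Interpret a legacy single-date entry like 'PASS (2026-03-10)' as
    (first_pass, reviewed) with both dates equal, per the retroactive
    convention. (Hypothetical helper for illustration.)"""
    m = re.fullmatch(r"\w+ \((\d{4}-\d{2}-\d{2})\)", entry.strip())
    if m:
        d = m.group(1)
        return (d, d)  # single date: First Pass = Reviewed = same date
    raise ValueError(f"not a legacy single-date entry: {entry!r}")
```

Dual-date records created going forward never pass through this path; only pre-convention entries are rewritten.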
Status Labels¶
- ✅ PASS: Test passed on first pass and was confirmed on the review date. OCR or deterministic verification at ≥95% accuracy.
- ❌ FAIL: Test did not pass on the first pass date or failed on review. Evidence preserved in `reports/ocr-output/<run-id>/`.
- 🔄 REGRESSION: Test passed on the first pass date but failed on a later review or subsequent run. Non-destructive evidence from both runs preserved; delta documented in the bug tracker.
- 🟡 PARTIAL: Test passed some gates (structural/semantic) but failed others (value/tolerance). Applicable gates listed in notes; full investigation in the bug tracker.
- ⏳ REVIEW PENDING: Evidence captured; awaiting OCR or deterministic verification against the golden baseline.
Evidence Archival¶
All First Pass evidence and all Reviewed evidence must be stored separately by run ID:
- `reports/ocr-output/<first-pass-run-id>/` — original first-pass screenshots and OCR output
- `reports/ocr-output/<review-run-id>/` — review/regression-run screenshots and OCR output
- When a REGRESSION occurs, both run directories remain in `reports/` and are cross-linked in the bug-tracker entry
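The cross-link rule can be sketched in a few lines. `regression_crosslink` below is a hypothetical helper (not an existing repo script) that enforces the non-destructive rule — both run directories must still exist — before emitting a record suitable for pasting into a bug-tracker entry:

```python
from pathlib import Path

def regression_crosslink(reports_base: str, first_pass_run: str, review_run: str) -> dict:
    """Verify BOTH evidence directories survive and build a cross-link record.
    Assumes the reports/ocr-output/<run-id>/ layout described above.
    (Hypothetical helper for illustration.)"""
    first_dir = Path(reports_base) / "ocr-output" / first_pass_run
    review_dir = Path(reports_base) / "ocr-output" / review_run
    missing = [str(d) for d in (first_dir, review_dir) if not d.is_dir()]
    if missing:
        # a REGRESSION entry with deleted evidence violates the archival rule
        raise FileNotFoundError(f"evidence directories missing: {missing}")
    return {
        "first_pass_evidence": str(first_dir),
        "review_evidence": str(review_dir),
    }
```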
Test Infrastructure (Updated Feb 28, 2026)¶
Note
Workspace requirement: QA validation, OCR comparison, and all scripts referenced in this document require the WS1 Full workspace profile. The scripts/, reports/, and src/ directories are not available in WS2 (Modified) or WS3 (Admin) sparse checkouts. See docs-developer/dev-workflow.md for workspace profile definitions.
Proxmox VM Environment 🆕¶
ConstructiVision tower computer converted to Proxmox VM — This is now our primary legacy testing environment.
| Component | Details |
|---|---|
| Host | Proxmox VE 8.x (dedicated server) |
| Backup | Automated ZFS snapshots (can clone/revert) |
| Remote Access | Tailscale VPN + SSH |
| Patching | LegacyUpdate.net for XP/Vista updates |
Windows Test VMs (Updated Feb 28, 2026)¶
| VM ID | Name | Purpose | Status |
|---|---|---|---|
| 102 | XP-CV-V11 | CV v11.0 reference — cloned from 103, v11.0 installed. AutoIT validation target. | Running |
| 103 | XP-LEGACY | MASTER COPY — original 2008 GSCI workstation. v7.0 production. DO NOT MODIFY. | Running |
| 104 | XP-TEST | CV v7.0 patch — fresh XP SP3 for testing. Deferred VLX loading deployed (Bug 20). AutoIT 6/6 pass. | Running |
| 105 | Install-Dev | InstallShield IDE — IS 5.0/6.10 compiler environment (cloned from VM 104). | Stopped |
| 106 | Vista-TEST | Windows Vista test environment. Security-hardened (SMBv1 disabled, DEP AlwaysOn). | Stopped |
| 107 | Win7-TEST | Windows 7 test environment. Security-hardened (SMBv1 disabled, DEP AlwaysOn). | Stopped |
| 108 | Win10x32-TEST | Win10 32-bit — CV auto-loads with config script. Junction-swap deployment. | Running |
| 109 | Win10x64-TEST | Win10 64-bit CAD workstation — AC2000 + CV working after Wow6432Node COM fix. | Running |
| 201 | Alpha-TEST1 | Alpha testing (Tai) — GSCI engineer, PB11-00x32 deployed via `Deploy-AlphaVM.ps1` | Running |
| 202 | Alpha-TEST2 | Alpha testing (Dat) — GSCI engineer, PB11-00x32 deployed via `Deploy-AlphaVM.ps1` | Running |
Note
IP addresses, SSH configuration, and network topology have been moved to internal infrastructure documentation for security. See docs-sensitive/vm-infrastructure/ (requires GitHub login).
Alpha Testing VMs (201-202): GSCI engineers (Tai and Dat) access via SSH + RDP. Deployed with scripts/Deploy-AlphaVM.ps1 which switches sparse-checkout to PB11-00x32, updates junction, and creates desktop shortcuts. See alpha-testing-gsci-plan (sensitive).
VM Security Hardening (Applied Jan 29, 2026)¶
Vista-TEST (106) and Win7-TEST (107) have been hardened with the following security measures:
| Security Measure | Vista | Win7 | Notes |
|---|---|---|---|
| Windows Firewall (all profiles) | ✅ ON | ✅ ON | Domain, Private, Public all enabled |
| Windows Defender | ✅ Running | ✅ Running | Built-in AV protection |
| Panda Antivirus | ✅ Installed | ✅ Installed | Additional AV layer |
| SMBv1 Disabled | ✅ | ✅ | EternalBlue/WannaCry protection |
| Remote Registry Disabled | ✅ | ✅ | Prevents remote registry access |
| SSDP Discovery Disabled | ✅ | ✅ | Reduces attack surface |
| UPnP Host Disabled | ✅ | ✅ | Prevents UPnP exploits |
| Autorun/Autoplay Disabled | ✅ | ✅ | Malware vector mitigation |
| DEP (AlwaysOn) | ✅ | ✅ | Data Execution Prevention for all programs |
| Guest Account Disabled | ✅ | ✅ | No anonymous access |
| WMP Network Sharing | N/A | ✅ Disabled | Closed ports 554/10243 |
Registry Changes Applied:
# SMBv1 Disabled (EternalBlue protection)
HKLM\SYSTEM\CurrentControlSet\Services\LanmanServer\Parameters\SMB1 = 0
# Autorun/Autoplay Disabled
HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\Explorer\NoDriveTypeAutoRun = 255
HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\Explorer\NoAutorun = 1
Services Disabled:
- `RemoteRegistry` — Remote Registry
- `SSDPSRV` — SSDP Discovery
- `upnphost` — UPnP Device Host
- `WMPNetworkSvc` — Windows Media Player Network Sharing (Win7 only)
DEP Configuration:
bcdedit /set nx AlwaysOn
LegacyUpdate.net Patches Applied¶
Critical security patches installed via LegacyUpdate.net on Vista and Win7 VMs:
Windows Vista (VM 106):
| KB Number | Description | CVE |
|---|---|---|
| KB4012598 | MS17-010 SMB Remote Code Execution (WannaCry) | CVE-2017-0143 |
| KB4493730 | SHA-2 Code Signing Support | — |
| KB4474419 | SHA-2 Update for Secure Boot | — |
| KB4054518 | .NET Framework 4.7.1 Security Update | Multiple |
| KB3205401 | Internet Explorer 9 Cumulative Update | Multiple |
Windows 7 (VM 107):
KB Number |
Description |
CVE |
|---|---|---|
KB4012212 |
MS17-010 SMB Remote Code Execution (WannaCry) |
CVE-2017-0143 |
KB4490628 |
SHA-2 Servicing Stack Update |
— |
KB4474419 |
SHA-2 Code Signing Support |
— |
KB4536952 |
Extended Security Update (ESU) Licensing Prep |
— |
KB5017361 |
ESU September 2022 Security Rollup |
Multiple |
KB5022338 |
ESU January 2023 Final Rollup |
Multiple |
LegacyUpdate Installation:
Download from https://legacyupdate.net
Run installer (auto-detects OS version)
Updates Windows Update agent and certificates
Enables access to Microsoft Update Catalog
Apply all critical/important updates
VM Software Verification (Updated Jan 30, 2026)¶
Remote verification of installed software via SSH — Use these commands to verify software installation across all VMs without manual inspection.
Method 1: Program Files Directory Listing (Recommended)
Lists installed programs by checking Program Files folders:
# Check single VM
ssh vista-106 "dir /b C:\PROGRA~1"
# Check all VMs in sequence
@("xp-103", "xp-104", "vista-106", "win7-107") | ForEach-Object {
Write-Host "`n=== $_ ===" -ForegroundColor Cyan
ssh $_ "dir /b C:\PROGRA~1" 2>$null
}
Method 2: WMIC Product Query (Vista/Win7 only)
Lists MSI-installed products (slower, may miss some software):
ssh vista-106 "wmic product get name"
ssh win7-107 "wmic product get name"
⚠️ Note: WMIC may not work on XP and doesn’t show all installed software (only MSI-based installs).
Standardized Testing Stack (Updated Feb 28, 2026):
XP VMs (102–104) run the full testing toolset including AutoIT for automated menu validation. Vista/Win7 VMs have monitoring tools only.
| Tool | Version | Purpose | XP-102 | XP-103 | XP-104 | Vista-106 | Win7-107 | Win10x32-108 | Win10x64-109 |
|---|---|---|---|---|---|---|---|---|---|
| Total Uninstall | 6.18.0 | Install monitoring & clean uninstall | ✅ | ✅ | ✅ | ✅ | ✅ | — | — |
| Regshot | 1.9.0 | Registry/file snapshot comparison | ✅ | ✅ | ✅ | ✅ | ✅ | — | — |
| InCtrl5 | 5.0 | Before/after install snapshots | ✅ | ✅ | ✅ | ✅ | ✅ | — | — |
| AutoIt3 | 3.3.16 | UI automation & menu validation | ✅ | ✅ | ✅ | — | — | ✅ | ✅ |
Supporting Software:
| Tool | Purpose | XP-102 | XP-103 | XP-104 | Vista-106 | Win7-107 | Win10x32-108 | Win10x64-109 |
|---|---|---|---|---|---|---|---|---|
| Supermium | Modern browser for legacy OS | ✅ | ✅ | ❌ | ✅ | ✅ | — | — |
| Panda AV 2010 | Antivirus protection | ✅ | ✅ | ✅ | ✅ | ✅ | — | — |
| Bitvise SSH | Remote command execution | ✅ | ✅ | ✅ | ✅ | ✅ | | |
| Legacy Update | Windows Update access | ✅ | ✅ | ✅ | ✅ | ✅ | | |
Total Uninstall Version Selection¶
We evaluated multiple versions to find the newest XP-compatible release:
| Version | File Size | PE Header | XP Compatible | Notes |
|---|---|---|---|---|
| 5.9.0 | 6.4 MB | Windows 5.00 | ✅ | Oldest tested |
| 6.0.0 | 18.9 MB | Windows 5.00 | ✅ | — |
| 6.18.0 | 26.7 MB | Windows 5.00 | ✅ | Best choice — newest XP-compatible |
| 7.6.2 | 15.2 MB | Windows 6.00 | ❌ | Vista+ only |
Winner: Total Uninstall 6.18.0 — The most feature-rich version that still supports Windows XP (PE32 for Windows 5.00).
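The PE-header test behind this table can be reproduced offline. The sketch below reads Major/MinorOperatingSystemVersion from the optional header (offset 40 for both PE32 and PE32+): a value of 5.00 means the binary declares Windows 2000/XP capability, 6.00 means Vista+. `min_os_version` is an illustrative helper — a production check would more likely use a library such as `pefile`:

```python
import struct

def min_os_version(pe_bytes: bytes) -> tuple[int, int]:
    """Return (MajorOperatingSystemVersion, MinorOperatingSystemVersion)
    from a PE image. (5, 0) = 'Windows 5.00' in the table above (XP-capable);
    (6, 0) = Vista+ only. Minimal parser for illustration."""
    if pe_bytes[:2] != b"MZ":
        raise ValueError("not an MZ executable")
    # e_lfanew at 0x3C points to the 'PE\0\0' signature
    e_lfanew = struct.unpack_from("<I", pe_bytes, 0x3C)[0]
    if pe_bytes[e_lfanew:e_lfanew + 4] != b"PE\x00\x00":
        raise ValueError("missing PE signature")
    # Optional header starts after 4-byte signature + 20-byte COFF header;
    # the OS version fields sit at offset 40 in both PE32 and PE32+.
    opt = e_lfanew + 4 + 20
    return struct.unpack_from("<HH", pe_bytes, opt + 40)
```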
Tool Locations:
| Location | Contents |
|---|---|
| | Primary install monitoring tool |
| | Portable registry/file snapshot tool |
| | Install monitoring tool |
| | UI automation scripting |
| | SSH server for remote access |
Desktop Shortcuts: All 4 tools have shortcuts on the Administrator desktop across all VMs, e.g.:

- Total Uninstall 6.lnk
Nightly Build Deployment (Win10 & Alpha VMs)¶
The Windows 10 test VMs and Alpha testing VMs automatically pull the latest ConstructiVision build from GitHub each night using a scheduled task.
Task Name: ConstructiVision Git Pull
| Setting | Value |
|---|---|
| Schedule | Daily at 22:00 |
| Run As | Administrator (Highest privilege) |
| Action | |
| VMs | 108 (Win10x32), 109 (Win10x64), 201 (Alpha-Tai), 202 (Alpha-Dat) |
How it works:
1. Sparse checkout — Each VM clones the repo with `--filter=blob:none --sparse`. Sparse-checkout paths depend on architecture:
   - x32 VMs (108, 201, 202): `src/x32/` + `src/Project Files/`
   - x64 VMs (109): `src/x64/` + `src/Project Files/`
2. Junction-swap deployment — `CV Update.bat` manages a directory junction from Program Files into the sparse checkout. On test VMs with both TB and PB builds, it provides an interactive picker to switch between them: `C:\Program Files\ConstructiVision → C:\Repos\Constructivision\src\x32\TB11-01x32` or `C:\Repos\Constructivision\src\x32\PB11-00x32`. On alpha VMs with only PB, it runs in pull-only mode (no picker).
3. Self-update protection — `CV Update.bat` copies itself to `%TEMP%` before running `git pull`, preventing CMD from reading garbage if the batch file itself is modified by the pull.
4. Nightly pull — The scheduled task runs `CV Update.bat /pull` at 22:00, downloading only changed blobs.
5. SSH deploy keys — Each VM has its own read-only SSH deploy key (ed25519, no passphrase) registered on the GitHub repo.
Setup script: Each VM has a reference script at C:\Users\Administrator\Desktop\clone-cv.txt containing the key generation, clone, and junction commands (renamed from .cmd to .txt to prevent accidental execution).
Multi-OS Installer Testing (Week 4-5: Feb 10-24) — COMPLETED¶
Test Objectives (Revised Sprint Plan)¶
Context: Week 3 Nero analysis validated wizard approach. Week 4 (Feb 10) achieved breakthrough: AC2000 + CV3.60 installed on all 3 VMs (XP/Vista/Win7) with no UAC issues. Validation bugs are tracked in the live register (see Bug Tracker — Validation Campaign). 41 files identical across all platforms.
Decision Point (Feb 28) — COMPLETED:
- ✅ PASSED (Feb 10): Installer works on XP/Vista/Win7 as Administrator
- ✅ PASSED (Feb 17): Win10 32-bit (VM 108) — auto-configured via `Configure-ConstructiVision.ps1`
- ✅ PASSED (Feb 17): Win10 64-bit (VM 109) — working after Wow6432Node COM registry fix
- ✅ Source-mode testing active: TB11 loads `.lsp` directly, enabling rapid iteration
- ⏳ Next: NanoCAD compatibility testing (free), then latest AutoCAD (purchase required)
Multi-OS Testing Matrix¶
| OS | VM | Priority | Status | Results (Feb 10) |
|---|---|---|---|---|
| Windows XP SP3 | VM 104 | Baseline | ✅ PASSED | AC2000 + CV3.60 installed, 41 files deployed, 8 bugs documented |
| Windows Vista SP2 | VM 106 | P2 | ✅ PASSED | No UAC issues (logged in as Administrator), identical file deployment |
| Windows 7 SP1 | VM 107 | P2 | ✅ PASSED | No UAC issues (logged in as Administrator), identical file deployment |
| Windows 10 32-bit | VM 108 | P1 | ✅ PASSED | AC2000 + CV11 working. Auto-configured via `Configure-ConstructiVision.ps1` |
| Windows 10 64-bit | VM 109 | P1 | ✅ PASSED | Working after Wow6432Node COM registry fix (103 CLSIDs, 104 Interfaces, 6 ProgIDs) |
Key Finding (Feb 10): All 3 tested platforms produce identical file deployments (41 files). Bugs are consistent across all OSes. The installer is more compatible than initially feared — the “95% failure rate” was based on non-Administrator testing.
Key Finding (Feb 17–28): Win10 32-bit (VM 108) and 64-bit (VM 109) both fully working with ConstructiVision v11 in source-mode (TB11). Auto-configuration handled by scripts/Configure-ConstructiVision.ps1. Win10x64 required Wow6432Node COM registry fix. Total bugs across all platforms: 21 (see Bug Tracker — Validation Campaign).
Feb 10 Multi-OS Testing Results 🆕¶
Monitoring Tools Used: Total Uninstall 6.18, Inctrl5 1.0.0.0, Regshot 1.9.0, AutoIT 3.3.16 (menu validation automation on XP VMs)
Test Results Summary:
| Metric | XP SP3 (VM 104) | Vista SP2 (VM 106) | Win7 SP1 (VM 107) |
|---|---|---|---|
| AC2000 Install | ✅ | ✅ | ✅ |
| CV3.60 Install | ✅ | ✅ | ✅ |
| Files Deployed | 41 | 41 | 41 |
| UAC Issues | N/A | ✅ None (Admin) | ✅ None (Admin) |
| Bugs Found | 8 | 8 (same) | 8 (same) |
| acad.exe Crash | No | Yes (Bug 6) | Yes (Bug 6) |
Monitoring Data: test-results/{XP,Vista,Win7}-Monitoring/CV360/ (9 files)
Full Analysis: CV360-Installation-Analysis.md (470 lines)
Snapshots Taken:
- `ac2000-installed-20260210` — AC2000 only, all 3 VMs
- `cv360-installed-20260210` — AC2000 + CV3.60, all 3 VMs
Why testing is the modernization accelerator¶
Without a regression suite, every modernization step is a guess. With a regression suite, every step becomes: “did output match?”
Static Analysis Suite (March 2026) 🧪¶
Offline code-quality checks that run without AutoCAD. Designed as the safety net for clean-code refactoring (Passes 2–4) — catches regressions before VM testing.
Script: scripts/cv-static-analysis.py
Usage:
# Full suite (6 checks)
python scripts/cv-static-analysis.py
# Single check
python scripts/cv-static-analysis.py --check parens
# Generate regression snapshot (SHA-256 per file)
python scripts/cv-static-analysis.py --snapshot
# Compare two snapshots
python scripts/cv-static-analysis.py --diff reports/snapshot-A.json reports/snapshot-B.json
# Custom build directory
python scripts/cv-static-analysis.py --dir src/x64/TB11-01x64
Checks Performed¶
| # | Check | What it catches | Baseline (March 13, 2026) |
|---|---|---|---|
| 1 | Paren balance | Mismatched parentheses | 132 pass, 2 known pre-existing |
| 2 | Load chain | Modules in `csvmenu.lsp` load chain missing from disk; orphaned modules | 99 modules present, 29 orphaned (by design) |
| 3 | Function catalog | Duplicate `defun` names | 184 defuns, 7 pre-existing duplicates |
| 4 | Named constants | Undefined or unreferenced constants | 58 constants, all defined + referenced |
| 5 | Convention lint | Tabs, block comments, stacked parens | Tabs/block-comments clean; 1,962 stacked parens (Pass 4 target) |
| 6 | DCL brace balance | Mismatched braces in `.dcl` files | 49 pass, 1 known pre-existing (ch_dlg.dcl) |
Known Pre-existing Issues (Accepted)¶
These are inherited from the original codebase and documented in the script’s configuration:
| File | Issue | Source |
|---|---|---|
| | +3 paren imbalance | Original v11 |
| | −2 paren imbalance | Original v11 |
| ch_dlg.dcl | +5 brace imbalance | Original v11 |
Regression Snapshots¶
The --snapshot command generates a JSON file with SHA-256 hashes, file sizes, and line counts for every .lsp and .dcl file. Use --diff to compare before/after refactoring:
# Before refactoring
python scripts/cv-static-analysis.py --snapshot
# ... refactor ...
python scripts/cv-static-analysis.py --snapshot
# Compare
python scripts/cv-static-analysis.py --diff reports/snapshot-BEFORE.json reports/snapshot-AFTER.json
Baseline snapshot: reports/snapshot-20260313-122315.json (184 files).
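The snapshot/diff mechanism can be sketched in a few lines. The real script's JSON schema may differ — this is an illustrative sketch of the same idea: per-file SHA-256, byte size, and line count, then a set-based diff of the two snapshots:

```python
import hashlib
from pathlib import Path

def take_snapshot(build_dir: str) -> dict:
    """Per-file SHA-256, size, and line count for every .lsp and .dcl file
    under build_dir. (Illustrative sketch of --snapshot-style output.)"""
    snap = {}
    root = Path(build_dir)
    for path in sorted(root.rglob("*")):
        if path.suffix.lower() not in (".lsp", ".dcl"):
            continue
        data = path.read_bytes()
        snap[str(path.relative_to(root))] = {
            "sha256": hashlib.sha256(data).hexdigest(),
            "bytes": len(data),
            "lines": data.count(b"\n") + 1,
        }
    return snap

def diff_snapshots(before: dict, after: dict) -> list:
    """Files added, removed, or changed between two snapshots."""
    changed = [f for f in before.keys() & after.keys()
               if before[f]["sha256"] != after[f]["sha256"]]
    added = list(after.keys() - before.keys())
    removed = list(before.keys() - after.keys())
    return sorted(changed + added + removed)
```

An empty diff after a refactoring pass that was supposed to be behavior-preserving but file-changing would itself be suspicious; the point of the hash diff is to confirm that exactly the intended files changed.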
Early Win: Paren Regression Discovery¶
The suite immediately caught 2 paren regressions from Pass 1 annotation — (progn accidentally placed on ;;--- comment separator lines in fenable.lsp and wdenable.lsp, hiding opening parens from the AutoLISP parser. Fixed in commit 75d3dce1.
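The class of check that caught this can be sketched as a comment- and string-aware balance counter. This is a simplified illustration of the idea, not the actual `cv-static-analysis.py` implementation — note in particular that it ignores `;| ... |;` block comments:

```python
def paren_balance(lisp_source: str) -> int:
    """Net paren depth for AutoLISP source, ignoring parens inside
    ;-comments and "..." strings. A '(progn' placed on a ;;--- separator
    line is invisible to the parser, so the later close paren drives the
    balance negative — exactly the regression class described above.
    (Simplified sketch: no ;| ... |; block-comment handling.)"""
    depth = 0
    in_string = False
    for line in lisp_source.splitlines():
        i = 0
        while i < len(line):
            ch = line[i]
            if in_string:
                if ch == "\\":
                    i += 1          # skip escaped character inside string
                elif ch == '"':
                    in_string = False
            elif ch == ";":
                break               # rest of line is a comment
            elif ch == '"':
                in_string = True
            elif ch == "(":
                depth += 1
            elif ch == ")":
                depth -= 1
            i += 1
    return depth                     # 0 = balanced
```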
Source-Mode Testing (TB11)¶
TB11-01x32 is the active test build containing 126 .lsp files and 44 .dcl files loaded directly from source — not compiled into a VLX bundle. This enables a significantly faster test cycle compared to traditional VLX-based testing.
How Source-Mode Loading Works¶
- `acaddoc.lsp` runs per-document (every drawing opened) and loads the ConstructiVision menu via `csvmenu.lsp`
- `csvmenu.lsp` loads all `.lsp` modules from the application directory using `(load)` calls
- `acad.lsp` provides an `S::STARTUP` backup: `ACADLSPASDOC=0` makes `acad.lsp` run once at startup, while `acaddoc.lsp` handles per-document initialization
- Startup Suite disabled: `NumStartup=0` in the AutoCAD profile prevents a stale VLX from intercepting the load chain
Benefits for Testing¶
- Instant iteration: edit a `.lsp` file → restart AutoCAD → changes take effect immediately. No VLX recompilation required.
- Targeted debugging: individual modules can be isolated by commenting out `(load)` calls in `csvmenu.lsp`
- Production parity: PB11-00x32 (the production baseline) uses the same source files, so TB11 source-mode findings apply directly to the production build
Deferred VLX Compilation¶
VLX compilation is intentionally deferred until the source-mode validation cycle is complete. Once all menu items pass validation and the bug count stabilizes, the .lsp sources will be compiled into csv.vlx for distribution via Autodesk App Store. See Release and Distribution Plan (App Store First) for the distribution plan.
Cross-References¶
This document is part of the ConstructiVision modernization documentation suite:
| Document | Purpose |
|---|---|
| | Release strategy (App Store PRIMARY, WiX ABORTED) |
| | DFMEA failure mode analysis — every bug must link to a DFMEA entry |
| | Authoritative live bug tracker — current counts, status, and DFMEA links |
| | Tech support workflow → GitHub Issues integration |
| | SDLC process (DFSS-aligned) governing this testing strategy |
Test layers¶
Parity Test Plan Mapping (Doc 45)¶
The P0 incremental baseline (below) feeds directly into the 45-tb11-parity-test-plan acceptance scorecard:
| Doc 06 Layer | Doc 45 Gate | Relationship |
|---|---|---|
| P0 Steps 0–2 (launch, layout, CSV init) | G7 Infrastructure | Foundation — must pass before any other gate |
| P0 Steps 3–6 (drawing open, CSV on drawings) | G1 Menu Routing + G5 Drawing Ops | Validates command dispatch and file operations |
| P0 Steps 7–12 (individual menu commands) | G1 Menu Routing + G2 Dialog Appearance | Validates progcont routing and dialog rendering |
| P-series sub-dialog capture | G3 Field Functionality + G4 Data Persistence | Validates field input, XRecord round-trip |
| Graduation (full au3 suite) | G8 Regression | End-to-end sweep after all fixes applied |
P0 results are evidence inputs to the doc 45 scorecard. The doc 45 pass formula (≥95% of 76 testable items) aggregates results from all test layers.
Priority 0) Incremental Baseline Validation (March 2026 — ACTIVE)¶
Warning
This section exists because we kept regressing. Two days of Bug 30 debugging showed that jumping straight to full AutoIT validation (11 screenshots, 6 menu items, 2 drawings) makes it impossible to isolate which layer broke. Every failure looked the same: “Cannot find the specified drawing file.” The actual root causes ranged from junction paths, to csvmenu modal alerts, to filedia state, to acaddoc.lsp load order — but they all produced the same symptom at Phase 1.
The fix: test ONE thing at a time, confirm it across every configuration under test, then add the next thing. If Step N fails on TB11 but passes on the others, the problem is isolated to TB11's Step N delta — no guessing required.
Philosophy: Each step adds exactly ONE new capability on top of the previous step. A step passes only when all configurations produce equivalent results. Never skip ahead — if Step 3 fails, fix it before attempting Step 4.
Configurations under test:
| Config | VM | Build | Load Method | Description |
|---|---|---|---|---|
| A | 103 | v7.0 | CSV.VLX (Startup Suite) | MASTER COPY — golden reference, never modified |
| B | 102 | v11 PB11 | CSV.VLX (Startup Suite) | Golden v11 reference — rebuilt to match 103 |
| C | 104 | PB11-00x32 | CSV.VLX (acaddoc.lsp) | Production build on test VM — proves the OS/AutoCAD work |
| D | 104 | TB11-01x32 | csv.lsp (Startup Suite) | Source-mode test build — the one that keeps breaking |
| E | 108 | TB11-01x32 | csv.lsp (acaddoc.lsp) | Win10 x32 — TB11 source-mode only (PB11/VLX abandoned — Bug 37) |
| F | 109 | TB11-01x64 | csv.lsp (acaddoc.lsp) | Win10 x64 — AC2000 via WoW64, double junction |
| G | 109 | TB11-01x64 | csv.lsp (acaddoc.lsp) | Win10 x64 — AC2026 native, CUI-based menu loading |
Switching between C and D on VM 104: Use CV Update.bat to toggle the C:\Program Files\ConstructiVision junction between PB11-00x32 (Config C) and TB11-01x32 (Config D). Restart AutoCAD after switching.
VM 109 double junction (x64): On Win10 x64, WoW64 redirects 32-bit apps from C:\Program Files to C:\Program Files (x86). Both paths have junctions to src\x64\TB11-01x64. See CV Update.bat commit 3e88b9d for the automated double-junction logic.
VM 109 dual AutoCAD (Configs F + G): VM 109 runs both AutoCAD 2000 (32-bit, via WoW64) and AutoCAD 2026 (64-bit, native). Each uses separate au3 scripts: scripts/acad2000/cv-p0-step*.au3 (Config F) and scripts/acad2026/cv-p0-step*.au3 (Config G). On VM 109, scripts are stored at C:\CV-Validation\scripts\acad2000\ and C:\CV-Validation\scripts\acad2026\.
Comparison method: Each step captures a screenshot or measurable output. Compare across all 7 configs. The test passes when Configs D/E/F/G match A/B/C within tolerance.
OCR requirement: Every screenshot comparison MUST follow the Mandatory OCR Validation Procedure (see Debugging Philosophy section). No step is marked PASS without OCR evidence at ≥95% character match against the golden baseline.
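The ≥95% character-match gate can be approximated with a standard sequence-similarity ratio. The sketch below uses `difflib.SequenceMatcher` as one reasonable stand-in — the actual OCR validation tooling may compute its metric differently, so treat this as illustrative:

```python
import difflib

def char_match(ocr_text: str, golden_text: str) -> float:
    """Character-level similarity between an OCR capture and the golden
    baseline, as a fraction in [0, 1]. (Illustrative stand-in for the
    procedure's character-match metric.)"""
    return difflib.SequenceMatcher(None, ocr_text, golden_text).ratio()

def step_passes(ocr_text: str, golden_text: str, threshold: float = 0.95) -> bool:
    """Apply the ≥95% gate used to mark a P0 step PASS."""
    return char_match(ocr_text, golden_text) >= threshold
```

Instant-fail substrings ("Unknown command", "Cannot find", etc.) are checked separately; a capture can exceed the similarity threshold and still fail on one of those.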
Step 0: AutoCAD launches and closes cleanly¶
What this tests: AutoCAD 2000 starts without crash, no stale dialogs, no hung processes.
Procedure:
1. Kill any existing `acad.exe`: `taskkill /F /IM acad.exe 2>NUL`
2. Delete crash dump: `del "C:\Program Files\ACAD2000\acadstk.dmp" 2>NUL`
3. Launch AutoCAD: `start "" "C:\Program Files\ACAD2000\acad.exe"`
4. Wait 15 seconds
5. Verify process running: `tasklist /FI "IMAGENAME eq acad.exe"`
6. Verify no crash dump: `dir "C:\Program Files\ACAD2000\acadstk.dmp" 2>NUL`
7. Close AutoCAD: `taskkill /F /IM acad.exe`
Pass criteria:
- `acad.exe` appears in the task list
- No `acadstk.dmp` created
- Process terminates cleanly
| Config | Status | Notes |
|---|---|---|
| A (103 v7.0) | ✅ PASS (Mar 3) | 2256KB, clean launch, no crash dump |
| B (102 v11) | ✅ PASS (Mar 3) | 2256KB, clean launch, stale crash dump cleaned |
| C (104 PB11) | ✅ PASS (Mar 3) | 2256KB, clean launch |
| D (104 TB11) | ✅ PASS (Mar 3) | 2256KB, clean launch, csv_silent_load suppressed alert |
| E (108 TB11) | ✅ PASS (Mar 4) | Steps 0–1 passed during PB11 run. Step 0 re-verified after CV Update switch to TB11 |
| F (109 AC2000) | ✅ PASS (Mar 4) | Steps 0–1 passed during Step 2 run (20260304-133757). Clean launch. |
| G (109 AC2026) | ✅ PASS (Mar 6) | AC2026 native launch. Clean startup, no crash dump. |
Step 1: Standardize AutoCAD window layout¶
What this tests: AutoCAD opens with a consistent, standardized window layout across all VMs. This step sends LISP (setenv) commands after AutoCAD launches to apply classic profile settings at runtime — no registry writes needed.
Note
Registry approach abandoned (March 3, 2026). Previous versions of this script wrote golden layout values to the registry before launching AutoCAD. This proved unreliable: VM 102 worked (already had correct values), but VM 103’s active profile disappeared after AutoCAD exit, and VM 104’s DockWindow.Position was overwritten from our value back to a 3-row default. AutoCAD 2000’s registry behavior is inconsistent across VMs. The (setenv) approach is runtime, deterministic, and doesn’t fight with AutoCAD’s profile system.
Classic profile settings applied (via LISP setenv at runtime):
Environment Variable |
Value |
Effect |
|---|---|---|
|
|
6 visible command-line rows |
|
|
2000 lines of command history |
|
|
R12-style accelerator keys (Ctrl+C = Cancel) |
|
|
Full-screen crosshairs |
|
|
Black drawing area background |
|
|
Tooltips enabled |
|
|
Scrollbars enabled |
|
|
Maximum array items |
See AutoCAD 2000 Environment Variables (setenv / getenv) Reference for the complete environment variable reference.
Tip
Troubleshooting: If setenv values don’t stick, check (getvar "CPROFILE") for the active profile name, then manually edit HKCU\Software\Autodesk\AutoCAD\R15.0\ACAD-1:409\Profiles\<YourProfile>\Display\<VariableName> in the registry.
Procedure:
1. Kill any running `acad.exe`, delete stale crash dump, suppress Panda notifications
2. Launch AutoCAD
3. Dismiss Startup dialog via Cancel button (ControlClick, not ESC)
4. Dismiss CV alert via OK button if present (ControlClick, not ESC)
5. Wait 5 seconds for layout to settle, force maximize if needed
6. Send LISP `(foreach pair ... (setenv ...))` to apply all 8 classic settings
7. Capture full-window screenshot (`p0-01-layout-window.bmp`)
8. Send LISP `(foreach v ... (princ (getenv v)))` to verify all 8 settings, press F2 to open the Text Window, capture scrollback (`p0-02-envall-dump.bmp`), close the Text Window
9. Close AutoCAD cleanly via File > Exit
Pass criteria:
- Window is maximized (fills screen)
- Command line is docked at bottom, 6 rows visible
- Menu bar shows: File Edit View Insert Format Tools Draw Dimension Modify Express Window Help ConstructiVision
- No floating/undocked panels
- No error dialogs visible
- Screenshot size > 2 MB (maximized window with toolbars); F2 Text Window ~750 KB
- OCR character match ≥95% against VM 102 golden
- F2 Text Window shows all 8 getenv values matching expected (verified by OCR on `p0-02-envall-dump.bmp`)
| Config | Status | Screenshot | Notes |
|---|---|---|---|
| A (103 v7.0) | ✅ PASS (Mar 3) | p0-01, p0-02 (F2 Text Window) | 8/8 settings verified by OCR |
| B (102 v11) | ✅ PASS (Mar 3) | p0-01, p0-02 (F2 Text Window) | Golden reference. 8/8 settings verified by OCR |
| C (104 PB11) | ✅ PASS (Mar 3) | p0-01, p0-02 (F2 Text Window) | 8/8 settings verified by OCR |
| D (104 TB11) | ✅ PASS (Mar 3) | p0-01, p0-02 (F2 Text Window) | 8/8 settings verified by OCR |
| E (108 TB11) | ✅ PASS (Mar 4) | p0-01, p0-02 | Passed during PB11 run (Step 1 is build-agnostic) |
| F (109 AC2000) | ✅ PASS (Mar 4) | p0-01, p0-02 | Passed during Step 2 run. 8/8 settings applied. |
| G (109 AC2026) | ✅ PASS (Mar 6) | p0-01, p0-02 | AC2026 layout. 8/8 settings applied via setenv. |
Based on OCR comparison against golden (VM 102):

- All 4 configs produce identical getenv output in the F2 Text Window:
  - CmdVisLines=6
  - CmdHistLines=2000
  - AcadClassic=1
  - CursorSize=100
  - Background=0
  - ToolTips=1
  - Scrollbars=1
  - MaxArray=999999
- No instant-fail indicators (no “Cannot find”, “error”, “Function cancelled”, etc.)
- Layout screenshots: 2,256 KB each (all 4 identical size — maximized window)
- F2 Text Window screenshots: 750 KB each (all 4 identical size)
- Validated March 3, 2026 — commits `c733ae3` (buffer fix), `0bbdea2` (F2 capture), `275e594` (CV Update fix)
Step 2: ConstructiVision Initialization Verification¶
What this tests: The full CV initialization chain — not just menu bar presence, but actual csv command execution. The csv command is the anchor point for the entire CV module system. If csv runs and produces a dialog, everything upstream worked: acaddoc.lsp → csvmenu.lsp → csv.mnu → csv.lsp/CSV.VLX.
Also verifies configuration: Support File Search Path (ACAD), Project Name (PROJECTNAME), Menu Name (MENUNAME), Current Profile (CPROFILE), loaded menu groups (menugroup), and Startup Suite entries (registry).
Script: scripts/cv-p0-step2.au3
Procedure:
1. Launch AutoCAD (Steps 0–1 passed)
2. Dismiss Startup dialog if present
3. Dismiss CV alert if present (record: alert = csvmenu.lsp loaded)
4. Wait for full initialization, maximize window
5. Capture full-window screenshot — menu bar check (`p0-03-menubar.bmp`)
6. Type `csv` + Enter — the actual initialization test
7. Wait 5 seconds for dialog/error, capture screenshot (`p0-04-csv-command.bmp`)
8. Dismiss csv dialog (Escape)
9. Send config dump LISP commands (each under 256 chars for the AC2000 input buffer):
   - `(princ (strcat "\nACAD=" (getenv "ACAD")))` — search path
   - `(princ (strcat "\nPROJECT=" (getvar "PROJECTNAME")))` — project
   - `(princ (strcat "\nMENUNAME=" (getvar "MENUNAME")))` — menu file
   - `(princ (strcat "\nPROFILE=" (getvar "CPROFILE")))` — profile
   - F2 screenshot → `p0-05-config-paths.bmp`
   - `(setq i 0)(while (menugroup i) (princ (strcat "\nMENUGROUP=" (menugroup i))) (setq i (1+ i)))` — menu groups (index-based; parameterless `menugroup` errors on AC2000)
   - F2 screenshot → `p0-06-config-menus.bmp`
   - `(vl-load-com)` then a `vl-registry-read` loop for Startup Suite entries
   - F2 screenshot → `p0-07-config-startup.bmp`
10. Close AutoCAD via File > Exit
Output files:
| File | Content |
|---|---|
| p0-03-menubar.bmp | Full window — menu bar with ConstructiVision |
| p0-04-csv-command.bmp | Full window — csv dialog or error visible |
| p0-05-config-paths.bmp | F2 Text Window — ACAD, PROJECT, MENUNAME, PROFILE |
| p0-06-config-menus.bmp | F2 Text Window — menu groups |
| p0-07-config-startup.bmp | F2 Text Window — Startup Suite + END CONFIG |
| | Test execution log |
Pass criteria:
- OCR of `p0-03-menubar.bmp` contains “ConstructiVision” in the menu bar area
- OCR of `p0-04-csv-command.bmp` shows a dialog (NOT “Unknown command CSV”)
- OCR of `p0-05-config-paths.bmp` shows: `ACAD=` with valid path(s) containing `ConstructiVision`; `MENUNAME=` with path to `csvmenu`
- OCR of `p0-06-config-menus.bmp` shows: `MENUGROUP=` entries include `CSV` (or `ACAD`, `EXPRESS`)
- OCR of `p0-07-config-startup.bmp` shows: `STARTUP[0]=` with a csv.lsp or CSV.VLX path (or `STARTUP=<none>` for v7.0)
- No instant-fail indicators: “Unknown command”, “Cannot find”, “error”, “bad argument”
- Character match ≥95% against golden baseline (VM 102/103)
- Screenshots: menubar >2MB, config dump >500KB
| Config | Status | Screenshots | Notes |
|---|---|---|---|
| A (103 v7.0) | ✅ PASS (Mar 4) | p0-03, p0-04, p0-05 (3 screenshots) | VLX loaded natively (real dir). No Express menu. |
| B (102 v11) | ✅ PASS (Mar 4) | p0-03 through p0-07 (6 screenshots) | Golden reference. VLX loaded natively (real dir). “Unknown command Y”. |
| C (104 PB11) | ✅ PASS (Mar 4) | p0-03 through p0-07 (6 screenshots) | VLX loaded via acaddoc.lsp |
| D (104 TB11) | ✅ PASS (Mar 4) | p0-03 through p0-07 (6 screenshots) | csv.lsp loaded via Startup Suite. Noise: |
| E (108 TB11) | ✅ PASS (Mar 4) TB11 | p0-03 through p0-07 (6 screenshots) | PB11 FAIL (Bug 37 WinHelp). TB11 PASS — all screenshots clean, no errors. |
| F (109 AC2000) | ✅ PASS (Mar 6) | p0-03 through p0-07 (6 screenshots) | AC2000 via WoW64. Menu bar, csv dialog, config dump all clean. |
| G (109 AC2026) | ✅ PASS (Mar 6) | p0-03 through p0-07 (6 screenshots) | AC2026 native. CUI-based menu loading. Clean startup, csv dialog verified. |
Based on OCR comparison across all 4 configs (March 4, 2026):
Config A — VM 103 v7.0 (golden master, run 20260304-081445):
| Screenshot | Size | OCR Key Content | Instant-Fail? | Verdict |
|---|---|---|---|---|
|  | 2,256 KB | Menu bar: “ConstructiVision” (no Express menu — v7.0). Clean startup. | No | ✅ |
|  | 2,256 KB | Dialog: “ConstructiVision - Program Options”, “Version 11.00 2/4/2013”, “Create New Drawing”, “Batch Utilities” | No | ✅ |
|  | 750 KB |  | No | ✅ |
Note: VM 103 ran an earlier au3 version with only 3 screenshots and a combined config dump. The `error: too few arguments` is from the au3’s `(menugroup)` call without an index — a test infrastructure issue corrected in later au3 versions, not a CV bug.
Config B — VM 102 v11 (reference, run 20260304-085328):
| Screenshot | Size | OCR Key Content | Instant-Fail? | Verdict |
|---|---|---|---|---|
|  | 2,256 KB | Menu bar: “ConstructiVision”. Clean startup: “Regenerating model”, no errors | No | ✅ |
|  | 2,256 KB | Dialog: “ConstructiVision - Program Options”, “Version 11.00 2/4/2013”, “Create New Drawing”, “Batch Utilities” | No | ✅ |
|  | 750 KB |  | No (known) | ✅ |
|  | 750 KB |  | No | ✅ |
|  | 750 KB |  | No | ✅ |
|  | 750 KB |  | No | ✅ |
Note: VM 102’s `STARTUP=<none>` is due to the au3 using an incorrect registry key path at the time (fixed in later versions for VM 104 runs). The Startup Suite IS configured on VM 102 — this is a test artifact.
Config C — VM 104 PB11 (run 20260304-112402, commit afa0bbe):
| Screenshot | Size | OCR Key Content | Instant-Fail? | Verdict |
|---|---|---|---|---|
|  | 2,256 KB | Menu bar: “ConstructiVision”. Clean startup: “Regenerating model”, no errors | No | ✅ |
|  | 2,256 KB | Dialog: “ConstructiVision - Program Options”, “Version 11.00 2/28/2026”, “Create New Drawing”, “Batch Utilities” | No | ✅ |
|  | 750 KB |  | No (known) | ✅ |
|  | 750 KB |  | No | ✅ |
|  | 750 KB |  | No | ✅ |
|  | 750 KB |  | No | ✅ |
Config D — VM 104 TB11 (run 20260304-112903, commit afa0bbe):
| Screenshot | Size | OCR Key Content | Instant-Fail? | Verdict |
|---|---|---|---|---|
|  | 2,256 KB | Menu bar: “ConstructiVision”. Noise: | See note | ⚠️ PASS |
|  | 2,256 KB | Dialog: “ConstructiVision - Program Options”, full TB11 button set (View All Layers, Print All Layers, View/Print Selected Layers, Change 3D Viewpoint, Print Materials List, Print Revision History) | No | ✅ |
|  | 750 KB |  | No (noise only) | ✅ |
|  | 750 KB |  | No | ✅ |
|  | 750 KB |  | No | ✅ |
|  | 750 KB |  | No | ✅ |
Cross-config comparison:
| Feature | A (103 v7.0) | B (102 v11) | C (104 PB11) | D (104 TB11) |
|---|---|---|---|---|
| Menu bar “ConstructiVision” | ✅ | ✅ | ✅ | ✅ |
| csv dialog appears | ✅ | ✅ | ✅ | ✅ |
| Dialog title | “Program Options” | “Program Options” | “Program Options” | “Program Options” |
| Version string | 11.00 2/4/2013 | 11.00 2/4/2013 | 11.00 2/28/2026 | Missing (Bug 39) |
| Express Tools menu | No | Yes | Yes | Yes |
|  | Not tested | Yes (VLX) | Yes (VLX) | No (source) |
|  | No | No | No | Yes |
|  |  | ✅ | ✅ | ✅ |
| Startup Suite |  |  |  |  |
|  | Not tested | No | Yes (VLX loads it) | No |
TB11 noise explanation: The `error: LOAD failed` message on TB11 comes from the Startup Suite (not `acaddoc.lsp`). The Startup Suite registry has `CSV.VLX` as an entry, and AutoCAD’s native Startup Suite loader cannot load VLX files through NTFS junctions. On TB11, `csvmenu.lsp` also loads via the Startup Suite and succeeds, defining `c:csv` before `acaddoc.lsp` runs. When `acaddoc.lsp` checks `(not c:csv)`, it finds `c:csv` already defined and skips its loading block. The error is cosmetic and does not affect functionality. A future fix could remove `CSV.VLX` from the TB11 Startup Suite registry entry or delete `CSV.VLX` from the TB11 directory.
VM 104 Step 2 Run History (Bug 35 fix progression):
| Run | Timestamp | Mode | Commit | acaddoc.lsp | Result | Key Finding |
|---|---|---|---|---|---|---|
| 1 | 20260304-104736 | PB11 |  |  | ❌ FAIL |  |
| 2 | 20260304-104933 | TB11 |  |  | ✅ PASS | csv.lsp loaded via Startup Suite. No VLX loading attempted (c:csv already defined) |
| 3 | 20260304-112402 | PB11 |  |  | ✅ PASS | VLX loaded successfully via |
| 4 | 20260304-112903 | TB11 |  |  | ✅ PASS | Noise: Startup Suite emits |
OCR data stored in: reports/ocr-output/vm102-p0-step2/ (Config B), reports/ocr-output/vm103-p0-step2/ (Config A), reports/ocr-output/vm104-step2-run-112402/ (Config C), reports/ocr-output/vm104-step2-run-112903/ (Config D). Earlier failed runs in reports/ocr-output/vm104-pb11-step2-run4/ and reports/ocr-output/vm104-tb11-step2-run2/.
Config E — VM 108 PB11 Step 2 (run 20260304-124910) — Bug 37 Discovery:
VM 108 (Win10 x32) was initially set to PB11 mode (CSV.VLX) for Step 2 testing. Steps 0 and 1 passed, but Step 2 revealed a fatal Win10 compatibility issue with the compiled VLX.
| Screenshot | Size | OCR Key Content | Instant-Fail? | Verdict |
|---|---|---|---|---|
|  | 2,256 KB | Menu bar: “ConstructiVision”. Clean startup | No | ✅ |
|  | 2,256 KB | Dialog: “ConstructiVision - Program Options”, “Version 11.00 2/26/2026” | No | ✅ |
|  | ~100 KB | “WinHelp failed.” + OK button | Yes | ❌ |
|  | ~100 KB | “WinHelp failed.” + OK button (same dialog, F2 blocked) | Yes | ❌ |
|  | ~100 KB | “WinHelp failed.” + OK button | Yes | ❌ |
|  | ~100 KB | “WinHelp failed.” + OK button | Yes | ❌ |
Root cause (Bug 37): The compiled CSV.VLX calls WinHelp() internally during its cancel/cleanup code path. Windows 10 removed WinHlp32.exe from the OS entirely. On Win10, this call:
1. First produces a system error dialog (“WinHelp failed.” with OK button)
2. Then opens the default browser to a Microsoft download page for WinHelp
The browser launch is the critical issue — it steals focus from AutoCAD permanently, blocking ALL subsequent automation. On XP VMs (102, 103, 104), WinHelp is present and the call succeeds silently, so this problem never manifests.
Decision: Abandon PB11 (VLX) testing on Win10. The WinHelp dependency is compiled into the VLX binary and cannot be patched without full VLX recompilation. Since TB11 (source mode) does not call WinHelp, Win10 VMs will run TB11 exclusively. The _DismissWinHelp() au3 workaround was initially implemented but removed — suppressing a single dialog is insufficient when the VLX also launches a browser. VLX testing remains valid on XP VMs where WinHelp is available.
Configs E/F pivoted: Both Win10 configs (E=VM 108, F=VM 109) are now TB11 source-mode only. The config table above has been updated to reflect this.
OCR data: reports/ocr-output/vm108-step2-run1/ (PB11 failure run).
Config A–D Reruns (March 4, 2026 — updated au3 with 10s csv delay):
P0 Step 2 was rerun on all 4 XP configs using the updated cv-p0-step2.au3 (10s csv command wait for source-mode cold loading, commit 5402826). All reruns PASS:
| Config | VM | Build | Run Timestamp | Result | Notes |
|---|---|---|---|---|---|
| A | 103 | v7.0 |  | ✅ PASS | Menu bar: ConstructiVision. Dialog: “Program Options”. No Express (v7.0) |
| B | 102 | v11 |  | ✅ PASS | Menu bar: ConstructiVision. Dialog: “Program Options”. MENUGROUP=ACAD/EXPRESS/CSV |
| D | 104 | TB11 |  | ✅ PASS | Menu bar: ConstructiVision. csv dialog with TB11 buttons. |
Config C (VM 104 PB11) was not rerun — PB11 validated in earlier run, only TB11 retested.
OCR data: reports/ocr-output/vm103-step2-rerun/, reports/ocr-output/vm102-step2-rerun/, reports/ocr-output/vm104-tb11-step2-rerun/.
Config E — VM 108 TB11 Step 2 (run 20260304-133520) — ✅ PASS:
VM 108 (Win10 x32) TB11 source-mode. All 6 screenshots clean, no errors.
| Screenshot | Size | OCR Key Content | Instant-Fail? | Verdict |
|---|---|---|---|---|
|  | 2,550 KB | Menu bar: “ConstructiVision”, “Express”. Clean startup, no errors | No | ✅ |
|  | 2,550 KB | Dialog: “ConstructiVision - Program Options” with TB11 buttons | No | ✅ |
|  | 740 KB | Clean command history. No errors | No | ✅ |
|  | 740 KB |  | No | ✅ |
|  | 740 KB | MENUGROUP=ACAD, EXPRESS, CSV — all 3 present | No | ✅ |
|  | 740 KB |  | No | ✅ |
OCR data: reports/ocr-output/vm108-step2-rerun/.
Config F — VM 109 TB11 Steps 0–2 (run 20260304-133757 / 134305) — Steps 0–1 PASS, Step 2 ❌ FAIL:
VM 109 (Win10 x64) TB11 source-mode. Steps 0 and 1 PASS. Step 2 FAIL due to two issues:
| Screenshot | Size | OCR Key Content | Instant-Fail? | Verdict |
|---|---|---|---|---|
|  | 6,024 KB | Menu bar: “ConstructiVision”. | Yes | ❌ |
|  | 6,024 KB |  | Yes | ❌ |
|  | 1,191 KB | Confirms: COMMERCIAL error at startup + progopts load failure | Yes | ❌ |
|  | 1,191 KB |  | — | ⚠️ |
|  | 1,191 KB | MENUGROUP=ACAD, CSV (no EXPRESS — expected on x64) | No | ✅ |
|  | 1,191 KB |  | No | ✅ |
Root cause — Bug 38: ACAD path points to acad2000 subdirectory on x64:
The x64 build (`src/x64/TB11-01x64/`) contains a 648-file `acad2000/` legacy subdirectory with stale copies of all source files. On VM 109, the ACAD registry path was configured to `C:\Program Files (x86)\ConstructiVision\acad2000` instead of the root `C:\Program Files (x86)\ConstructiVision`.
This caused two failures:
1. COMMERCIAL error (Bug 29 variant): `acad2000/acad.lsp` auto-loads and does `(load "csvmenu") (load "csv")`, loading the stale `acad2000/csvmenu.lsp` which still had `#| |#` block comments (the Bug 29 fix was only applied to the root-level csvmenu.lsp)
2. progopts load failure: csv.lsp tries `(load "progopts")` but `progopts.lsp` doesn’t exist in the `acad2000` subdirectory — and the root ConstructiVision directory (where progopts.lsp lives) isn’t in the ACAD search path
Fixes applied:
1. Registry: Changed ACAD path on VM 109 from `...\ConstructiVision\acad2000` to `...\ConstructiVision` (root dir)
2. Repo: Fixed `src/x64/TB11-01x64/acad2000/csvmenu.lsp` — replaced `#| |#` block comments with `;;;` (copied from fixed root-level csvmenu.lsp)
3. Repo: Neutralized `src/x64/TB11-01x64/acad2000/acad.lsp` — removed the `(load "csvmenu") (load "csv")` calls to prevent double-loading if acad2000 is ever in the ACAD path again
4. Script: Updated `Configure-ConstructiVision.ps1` to auto-detect x64 and clean up legacy `\acad2000` path entries
OCR data: reports/ocr-output/vm109-step0/, reports/ocr-output/vm109-step1/, reports/ocr-output/vm109-step2/.
Config F — VM 109 TB11 x64 Retest (1024x768, March 6, 2026):
After the x64 sync fix (394befd — 187 missing source files copied from x32 to x64), VM 109 was retested at 1024x768 resolution. The progopts load failure is resolved — bp_dlg.lsp, progopts.lsp, and all other modules now present in x64 build. The 1024x768 comparison screenshot shows a clean “ConstructiVision - Program Options” dialog with all 14 TB11 buttons visible.
Config F Step 2 status: Awaiting formal au3 pipeline rerun. The 1024x768 comparison run was a quick manual validation (au3 Step 2 only), not a full formal pipeline run with all 6 screenshots. A full formal rerun is needed to mark Config F Step 2 as PASS.
OCR data: reports/ocr-output/vm109-tb-1024/, reports/ocr-output/vm109-step2-rerun/.
1024x768 Apples-to-Apples Comparison (March 6, 2026):
All 4 VMs were standardized to 1024x768 resolution for true apples-to-apples OCR comparison. Previous Win10 runs used higher resolutions (VM 108: 1280x720, VM 109: 1920x1080), making screenshot sizes and OCR artifacts non-comparable with XP VMs.
Key finding: VM 104 confirmed running TB11-01x32. Junction points to TB11-01x32 (csvmenu.lsp 12508 bytes dated 03/02/2026 matches TB, PB11 has 2632 bytes dated 02/28/2026). The VLX LOAD failed error in the menubar is expected — TB has no CSV.VLX but csv loads successfully through source mode.
| VM | Config | Resolution | Dialog | Version subtitle | All buttons visible | Notes |
|---|---|---|---|---|---|---|
| 102 | B (V11-Golden) | 1024x768 | “Program Options” (VLX) | ✅ Version 11.00 2/4/2013 | No (8 buttons) | WinXP Golden baseline |
| 104 | D (TB11-01x32) | 1024x768 | “Program Options” (source) | ❌ Missing (Bug 39) | Yes (14 buttons, Bug 40) | WinXP Clean Install |
| 108 | E (TB11-01x32) | 1024x768 | “Program Options” (source) | ❌ Missing (Bug 39) | Yes (14 buttons, Bug 40) | Win10 x32 |
| 109 | F (TB11-01x64) | 1024x768 | “Program Options” (source) | ❌ Missing (Bug 39) | Yes (14 buttons, Bug 40) | Win10 x64, after sync fix |
Bugs discovered:
- Bug 39 (Medium): Version subtitle `ConstructiVision Tilt-Up and PreCast Software Version 11.00 2/4/2013` not rendering in TB source-mode progopts dialog. The DCL file contains the text (verified identical on VM 104, 2925 bytes), but it does not render visibly. Consistent across all 3 TB VMs.
- Bug 40 (Medium): All 14 buttons visible in TB progopts dialog, including Print Revision History and Print Materials List, even with no project loaded. The PB11 VLX dialog shows only ~8 buttons appropriate for the no-project context. The `progopts.lsp` code checks `olddwg` to gray `matlist`/`revhist` but may not be triggering correctly; additionally, PB11 hides buttons entirely rather than graying them.
- Bug 41 (Low): `CV Update.bat` Version Manager shows “Active build: unknown” on XP because `dir` output on XP doesn’t include the junction target in bracket notation.
OCR data: reports/ocr-output/vm102-pb11-1024/, reports/ocr-output/vm104-pb11-1024/ (directory name predates TB11 confirmation), reports/ocr-output/vm108-tb-1024/, reports/ocr-output/vm109-tb-1024/.
Step 3: Drawing Open + CV Behavior Verification
What this tests: CV behavior when opening existing project drawings. Opens CSB001.dwg (panel) and CSBsite1.dwg (site), runs csv on each, and captures the resulting dialogs. This catches:
- Junction path resolution for Project Files
- `filedia=0` OPEN command accepts full path through junctions
- CV dialog routing differs by drawing type (panel vs site)
- XRef resolution for site drawings
- `acaddoc.lsp` re-fires per document open
- Multiple `csv` invocations in one session
Prerequisites: Steps 0–2 must pass on this config.
Script: scripts/acad2000/cv-p0-step3.au3 (AC2000), scripts/acad2026/cv-p0-step3.au3 (AC2026)
Drawing paths:
- Panel: `C:\Program Files\ConstructiVision\Project Files\ConstructiVision Sample Building\CSB001.dwg`
- Site: `C:\Program Files\ConstructiVision\Project Files\ConstructiVision Sample Building\CSBsite1.dwg`
Procedure:
1. Launch AutoCAD (Steps 0–2 passed)
2. Dismiss Startup dialog / trial popup, wait for initialization, maximize
3. Capture full-window screenshot — menu bar check (`p0-03-menubar.bmp`)
4. Phase A — Panel drawing:
   - `(setvar "filedia" 0)` then `open` with CSB001.dwg path
   - Wait for load, capture full-window screenshot (`p0-08-open-panel.bmp`)
   - Verify title contains “CSB001”
   - Send `(progn(c:csv)(princ))` — capture dialog (`p0-09-csv-panel.bmp`)
   - Dismiss dialog, capture F2 text window (`p0-09b-post-csv-panel.bmp`)
5. Phase B — Site drawing:
   - `open` with CSBsite1.dwg path
   - Wait for load (longer — XRefs), capture full-window (`p0-10-open-site.bmp`)
   - Verify title contains “CSBsite”
   - Send `(progn(c:csv)(princ))` — capture dialog (`p0-11-csv-site.bmp`)
   - Dismiss dialog, capture F2 text window (`p0-11b-post-csv-site.bmp`)
6. Phase C — Config dump (same as Step 2 but after 2 drawings opened):
   - ACAD path, PROJECTNAME, MENUNAME, CPROFILE → `p0-12-config-paths.bmp`
   - Menu groups (ACAD, EXPRESS, CSV) → `p0-13-config-menus.bmp`
   - Startup Suite + ARX list → `p0-14-config-startup.bmp`
7. Close AutoCAD via File > Exit
Output files:
| File | Content |
|---|---|
| `p0-03-menubar.bmp` | Full window — menu bar with ConstructiVision |
| `p0-08-open-panel.bmp` | Full window — CSB001.dwg loaded |
| `p0-09-csv-panel.bmp` | Full window — csv dialog on panel drawing |
| `p0-09b-post-csv-panel.bmp` | F2 Text Window — post-csv command history (panel) |
| `p0-10-open-site.bmp` | Full window — CSBsite1.dwg loaded |
| `p0-11-csv-site.bmp` | Full window — csv dialog on site drawing |
| `p0-11b-post-csv-site.bmp` | F2 Text Window — post-csv command history (site) |
| `p0-12-config-paths.bmp` | F2 Text Window — ACAD, PROJECT, MENUNAME, PROFILE |
| `p0-13-config-menus.bmp` | F2 Text Window — menu groups |
| `p0-14-config-startup.bmp` | F2 Text Window — Startup Suite + END CONFIG |
|  | Test execution log |
Pass criteria:
- OCR of `p0-03-menubar.bmp` contains “ConstructiVision” in menu bar area
- OCR of `p0-08-open-panel.bmp` shows CSB001 panel geometry (not “Cannot find”)
- OCR of `p0-09-csv-panel.bmp` shows a dialog (panel-specific: “Program Options”, “Panel Options”, or similar)
- OCR of `p0-10-open-site.bmp` shows CSBsite1 site layout with XRefs
- OCR of `p0-11-csv-site.bmp` shows a dialog (site-specific options)
- OCR of config dump screenshots match Step 2 baseline
- No instant-fail indicators: “Cannot find”, “File not found”, “Unknown command”, “error”, “bad argument”
- Screenshots: full-window >2MB, F2 text >500KB
| Config | Status | Screenshots | Notes |
|---|---|---|---|
| A (103 v7.0) | ⏭️ Skipped |  | Skipped — VM 102 passed; 103 only needed for bug checking |
| B (102 v11) | ✅ PASS (Mar 6) |  | Golden baseline — perfect |
| C (104 PB11) | ✅ PASS (Mar 6) |  |  |
| D (104 TB11) | ✅ PASS (Mar 6) |  | Bug 39: version subtitle missing in progopts dialog |
| E (108 TB11) | ✅ PASS (Mar 6) |  | Bug 39: version subtitle missing in progopts dialog |
| F (109 AC2000) | ✅ PASS (Mar 6) |  | Bug 39: version subtitle missing in progopts dialog |
| G (109 AC2026) | ✅ PASS (Mar 6) |  | Bug 39: version subtitle missing in progopts dialog |
Steps 4–6: Incremental Progcont Decomposition — COVERED BY STEP 7
Steps 4, 5, and 6 were designed as incremental sub-tests that decompose Step 7’s 11-item menu sweep into smaller, isolatable pieces:
- Step 4 (`cv-p0-step4.au3`): Single progcont routing verification — tests progcont 1 (Drawing Setup) to prove the routing mechanism works before sweeping all 11 items.
- Step 5 (`cv-p0-step5.au3`): File operation dialogs — tests 3 progcont values that produce file browser or project dialogs (Create New Project 262153, Create New Drawing 262161, Edit Existing Drawing 262145).
- Step 6 (`cv-p0-step6.au3`): Utility commands — tests the 7 remaining non-submenu items (Batch Utilities 262177, Slope Calculator 8193, Registration, Help, Web, About, Tech Support).
Status: ✅ Covered by Step 7 (Mar 6). Step 7 tests all 11 items (Steps 4+5+6 combined) and passed on configs B–G. Since Step 7 is a strict superset, Steps 4–6 are automatically satisfied. The individual au3 scripts remain available for targeted debugging if a specific progcont value regresses.
Step 12: Panel Detection — Site Layouts, Materials, Revision
What this tests: The 3 print/output items that depend on panel detection (Bug 63). These are the items that fail when csv.lsp cannot determine that the current drawing is a panel or site. Broken out from Step 11 to isolate panel detection failures from simple layer+print tests.
Depends on: Bug 63 fix (csv.lsp auto-detect of drawing type)
Script: scripts/acad2000/cv-p0-step12.au3 (AC2000), scripts/acad2026/cv-p0-step12.au3 (AC2026)
Diagnostic: Load csv_diag.lsp and run CSVDIAG command on VM with CSB001.dwg open before testing. F2 the output to verify detection state. See Bug 63 for details.
Submenu items tested:
| # | Sub-item | csv.mnu ID | progcont | Detection Required |
|---|---|---|---|---|
| 7 | Site > Select Layouts |  | 262657 | Site drawing (CSBsite1) |
| 8 | Materials List |  | 263169 | Panel detected as panel |
| 9 | Revision History |  | 264193 | Panel detected as panel |
Procedure:
1. Launch AutoCAD, open CSB001.dwg
2. Run `CSVDIAG` → F2 → screenshot diagnostic output
3. Item 8 (Materials List): send progcont 263169 command → wait → screenshot
4. Item 9 (Revision History): send progcont 264193 command → wait → screenshot
5. Open CSBsite1.dwg
6. Item 7 (Site Select Layouts): send progcont 262657 command → wait → screenshot
7. Close AutoCAD
Output files: `p0-12-07-print-sitelayout.bmp`, `p0-12-08-print-matlist.bmp`, `p0-12-09-print-revhist.bmp`, `p0-12-00-csvdiag.bmp` + `p0-step12-log.txt`
Pass criteria:
- `CSVDIAG` shows `csv_dwgtype: panel` when CSB001.dwg is open
- `CSVDIAG` shows panel_list XRecord FOUND or Tier 2 layers (connections+solid+perimeter) EXISTS
- Materials List (item 8) opens a materials dialog, NOT an error or empty response
- Revision History (item 9) opens a revision dialog, NOT an error or empty response
- Site Select Layouts (item 7) with CSBsite1.dwg open produces a layout selection dialog
- No instant-fail indicators (Cannot find, error, Function cancelled, bad argument)
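The CSVDIAG pass criteria can be expressed as a small predicate over the OCR’d diagnostic output. This is a sketch only; the marker strings come from the criteria above, and the function name is hypothetical:

```python
# Markers from the Step 12 pass criteria; the helper name is hypothetical.
REQUIRED = ("csv_dwgtype: panel",)
DETECTION_EVIDENCE = ("panel_list XRecord FOUND", "Tier 2 layers")
INSTANT_FAIL = ("Cannot find", "error", "Function cancelled", "bad argument")

def csvdiag_panel_ok(ocr_text):
    """True when CSVDIAG output shows panel detection backed by XRecord or
    Tier 2 layer evidence, with no instant-fail indicators."""
    if any(f in ocr_text for f in INSTANT_FAIL):
        return False
    if not all(r in ocr_text for r in REQUIRED):
        return False
    return any(e in ocr_text for e in DETECTION_EVIDENCE)
```

A site drawing would need the analogous `csv_dwgtype` marker; this predicate covers only the panel case above.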
Note
G1 trace result (March 20, 2026): On Config E (VM 108, TB11), the first successful G1 run showed:
- pc=263169 (Materials List): `ssget` found no INSERT entities on layer “0” in CSB001 → `quit / exit abort` (Bug 87 — fixed `3e2a7c32`: ssget moved before `load_dialog`; nil path exits cleanly via `(princ)`)
- pc=264193 (Revision History): ✅ Clean — `Result nil`
- pc=262657 (Site Select Layouts): ✅ Clean — enumerated plotters/papers/styles, `PLT_CHECK_OK → nil` (no plotter, expected)
Bug 87 fix restructured matl_dlg.lsp to perform ssget before (load_dialog). The “No panel blocks found” alert still appears for bare panel drawings (correct behavior — CSB001 has no component INSERT entities on layer “0”). Materials List requires a site drawing with XREF panel inserts. Re-verification needed on VM 108 to confirm fix deployed.
| Config | Status | Notes |
|---|---|---|
| A (103 v7.0) | ⬜ |  |
| B (102 v11) | ⬜ |  |
| C (104 PB11) | ⬜ |  |
| D (104 TB11) | ⬜ |  |
| E (108 TB11) | ⬜ | Bug 87 fixed (`3e2a7c32`); re-verification needed |
| F (109 AC2000) | ⬜ |  |
| G (109 AC2026) | ⬜ |  |
Graduation: Full AutoIT Validation
Once Steps 0–12 all pass across all 7 configurations, THEN run the full cv-menu-validation.au3 script. At this point, every individual capability has been verified — the full script should pass because it’s just combining Steps 0–12 in sequence.
If the full au3 fails after Steps 0–12 all pass individually, the problem is timing/sequencing in the au3’s rapid-fire command execution, not a missing capability. Debug the timing, not the functionality.
0) Environment compatibility baseline (Sprint Weeks 4-5: Feb 10-24)
Operating System Matrix:
| OS | VM Required | AutoCAD Versions | Priority |
|---|---|---|---|
| Windows XP SP3 | Yes (32-bit) | R14, 2000, 2002, 2004, 2005, 2006 | High (legacy baseline) |
| Windows Vista | Yes | 2007, 2008, 2009 | Medium (transition era) |
| Windows 7 | Yes | 2010, 2011, 2012, 2013 | High (still in use) |
| Windows 10 | Host or VM | 2014-2024 | Critical (primary target) |
| Windows 11 | Host or VM | 2022-2026 | Critical (future-proof) |
AutoCAD Version Groups:
| Group | AutoCAD Versions | OS Compatibility | VLX Format |
|---|---|---|---|
| Legacy 32-bit | R14, 2000, 2002, 2004, 2005, 2006 | XP, Vista | VLX (32-bit) |
| Transition | 2007, 2008, 2009 | XP, Vista, 7 | VLX (32-bit) |
| Modern 32/64 | 2010, 2011, 2012, 2013 | Vista, 7, 8 | VLX (32/64-bit) |
| 64-bit Only | 2014-2026 | 7, 8, 10, 11 | VLX (64-bit) |
Minimum Coverage Test Matrix (Updated Feb 28):
| Test | XP + ACAD 2000 | Vista + ACAD 2000 | Win7 + ACAD 2000 | Win10x32 + ACAD 2000 | Win10x64 + ACAD 2000/2026/NanoCAD 25 |
|---|---|---|---|---|---|
| Installer runs | ✅ | ✅ | ✅ | ✅ | ✅ (Total Uninstall) |
| Installer completes | ✅ | ✅ | ✅ | ✅ | ✅ (bypass 16-bit) |
| AutoCAD detects CSV | ⚠️ Bug 1,3 | ⚠️ Bug 1,3 | ⚠️ Bug 1,3 | ✅ (auto-config) | ✅ (auto-config + registry fix) |
| VLX loads | ✅ (manual) | ✅ (manual) | ✅ (manual) | ✅ (auto) | ✅ (auto) |
| Menu appears | ⚠️ Bug 3 | ⚠️ Bug 3 | ⚠️ Bug 3 | ✅ (auto) | ✅ |
| Basic command works | ✅ | ✅ | ✅ | ✅ | ✅ (after Wow6432Node fix) |
| Dialog opens (DCL) | ✅ | ✅ | ✅ | ✅ | ✅ |
| AutoIT validation | ✅ (VM 102/103/104) | — | — | — | — (manual testing only) |
| NanoCAD 25 | — | — | — | — | ✅ (exploratory, LSP-only) |
| AutoCAD 2026 | — | — | — | — | ✅ (VM 109) |
| Uninstaller works | ⬜ | ⬜ | ⬜ | ⬜ | ⬜ |
Note (Feb 28): Win10x32 (VM 108) fully working with the auto-configuration script (`scripts/Configure-ConstructiVision.ps1`). Win10x64 (VM 109) fully working after the Wow6432Node COM registry fix (103 CLSIDs, 104 Interfaces, 6 ProgIDs copied). Both x32 and x64 builds are crash-free. Source-mode testing is active with deferred VLX loading via `acaddoc.lsp`. VM 109 also runs AutoCAD 2026 and NanoCAD 25 for forward-compatibility testing (see ConstructiVision TB11-01x64 — Architecture & Deployment). Total bugs across all platforms: 21 — see Bug Tracker — Validation Campaign.
Iterative Fix Cycle:
Test on Platform → Document Issues → Triage → Fix .lsp → Retest All
- Source-mode testing: TB11 loads `.lsp` files directly (not compiled VLX), so fixes are instant — edit `.lsp`, restart AutoCAD, retest. No recompilation step needed.
- Regression test on ALL platforms after each fix round
- Budget 2 weeks for iterative cycles (may extend)
- All bugs tracked in Bug Tracker — Validation Campaign with DFMEA cross-references per 31 — Comprehensive Workflow & Human Factors Analysis
1) Smoke tests (fast, every build)
Smoke tests confirm that ConstructiVision loads and responds to basic commands. These run on every code change before deeper validation.
Automated (AutoIT): scripts/cv-menu-validation.au3 Phase 3 exercises the 6 top-level menu entries (csv, slope calculator, create/edit project/drawing, batch utilities) and confirms each dialog opens without error. This is the automated smoke test — if any of the 6 fail, the build is broken. See Software Development Lifecycle Phase 3a for the DFSS context.
Manual (source-mode): On Win10 VMs (108, 109), edit a .lsp file → restart AutoCAD → type csv at command line → confirm Program Options dialog appears. Source-mode loading means zero compilation delay.
Smoke test checklist:
| Test | Method | Expected Result | Time |
|---|---|---|---|
| AutoCAD launches without crash | SSH: | Process running, no | 30s |
| `csv` command | Type `csv` at the command line | Program Options dialog appears | 5s |
| All 93 modules loaded |  | No | 10s |
| Menu visible | Visual check or AutoIT | “ConstructiVision” menu in menu bar | 5s |
| Panel drawing opens | Open `CSB001.dwg` | Panel displayed with title block | 10s |
| Site drawing opens | Open `CSBsite1.dwg` | Site drawing displayed | 10s |
1.5) VLX Compilation Verification — COMPLETED (Week 3)
Goal: Prove we can rebuild VLX binaries from LSP source and validate against originals.
Status (Feb 28): VLX recompilation verified for v3.60. Source-mode testing (TB11) is now the primary development path — VLX compilation is deferred until the 65-workflow validation cycle (from 31 — Comprehensive Workflow & Human Factors Analysis Section 6) reaches stability. See Source-Mode Testing section above.
Completed verification:

- ✅ Compiled all LSP files → FAS → VLX using VLIDE (AutoCAD 2000 on VM 102)
- ✅ Compared compiled VLX size to original (v3.60: 1,103,600 bytes)
- ✅ Captured function lists via `(atoms-family 1)` — original vs compiled
- ✅ Diff confirmed: 109 FAS modules match (from binary analysis in v3.60 Source Recovery — Missing Dependency Fix)
- ✅ Basic commands verified with compiled VLX
Function List Capture Script:
```lisp
;; Capture the function names newly defined by loading a VLX, for the
;; original-vs-compiled comparison. (atoms-family 1) returns names as strings.
(defun capture-vlx-functions (vlx-path output-file / baseline new-atoms fp)
  (setq baseline (atoms-family 1))
  (load vlx-path)
  (setq new-atoms (vl-remove-if
                    '(lambda (x) (member x baseline))
                    (atoms-family 1)))
  (setq fp (open output-file "w"))
  (foreach a (vl-sort new-atoms '<)
    (write-line (vl-princ-to-string a) fp))
  (close fp)
  (length new-atoms))
```
Results:

- v3.60 VLX: 109 FAS modules confirmed
- v11 (TB11): 126 `.lsp` + 44 `.dcl` loaded in source-mode — VLX compilation deferred
- v7.0 VLX: Matches v11 module set (same product lineage, VM 103 baseline)
2) Regression tests (per fix cycle)
Regression testing validates that fixes don’t break previously-working functionality. The test matrix is derived from 31 — Comprehensive Workflow & Human Factors Analysis Section 6 (65 independently testable workflows).
Automated regression (AutoIT): After each fix commit, re-run cv-menu-validation.au3 on VMs 102, 103, and 104. Compare screenshots against baseline runs in reports/ocr-output/vm{102,103,104}-final/. The OCR pipeline (scripts/ocr-screenshots.py) extracts dialog text for automated diff.
Manual regression (priority workflows): Run the Priority 1 workflows from the Comprehensive Menu Validation section above:
| Category | Workflows | Method | Expected |
|---|---|---|---|
| Core Panel | P-01, P-02, P-03, P-13, P-14/15, P-20, P-21 | Manual on VM 108/109 | Panel generates with correct geometry |
| Batch & Print | B-01, B-02, B-07 | Manual on VM 102 | Batch completes, |
| Engineering | E-01, E-02 | Manual on VM 102 | CSV/MB files exported correctly |
| Site Drawing | S-01, S-05 | Manual on VM 102 | Site drawing created, panels attached |
| Utility | U-06 | Manual on VM 104 | Registration flow completes |
Cross-VM regression: After fixing any deployment-related bug (like Bugs 19–21), verify the fix on ALL active VMs (102, 104, 108, 109) before marking complete. The DFSS validation loop (doc 35 Phase 3) requires platform-independent verification.
OCR comparison method:
```shell
# Capture new screenshots
ssh Administrator@<VM_IP> "at HH:MM /interactive C:\run-validation.bat"
# Copy to developer workstation
scp Administrator@<VM_IP>:"C:\CV-Validation\VM102\*" reports/ocr-output/vm102-regtest/
# Run OCR pipeline
python scripts/ocr-screenshots.py reports/ocr-output/vm102-regtest/
# Diff against baseline
diff reports/ocr-output/vm102-final/ocr-output/summary.txt reports/ocr-output/vm102-regtest/ocr-output/summary.txt
```
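A plain `diff` is often enough, but a keyed comparison gives per-screenshot regression reports. A minimal sketch, assuming `summary.txt` holds `name: extracted text` lines (an assumption about the pipeline's output format, not its documented behavior):

```python
# Keyed baseline-vs-new comparison of OCR summary files. Assumes
# "name: text" lines — an assumed format, not the pipeline's documented one.
def parse_summary(path):
    entries = {}
    with open(path) as f:
        for line in f:
            if ":" in line:
                name, text = line.split(":", 1)
                entries[name.strip()] = text.strip()
    return entries

def regressions(baseline_path, new_path):
    """Screenshots whose OCR text changed or disappeared versus baseline."""
    base = parse_summary(baseline_path)
    new = parse_summary(new_path)
    return sorted(name for name in base if new.get(name) != base[name])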
3) Demo tests (release candidate)
Before any distribution milestone (App Store submission, alpha handoff), run the complete demo workflow end-to-end on a clean environment.
Demo script (full cycle):
1. Fresh AutoCAD 2000 launch (no prior CV state)
2. Load ConstructiVision via `csvmenu.lsp` (source-mode) or `csv.vlx` (compiled)
3. Create new project → Create new panel drawing → Define panel with openings + weld connections + pick points
4. Generate panel → verify 3D solid, dimensions, title block
5. Create site drawing → Attach panel → Print site layout
6. Batch print all panels in project
7. Export engineering data (Dayton/Richmond CSV)
8. Print materials list → verify `matlist.txt`
Demo platforms (Updated Feb 28):
| Platform | VM | AutoCAD | Status |
|---|---|---|---|
| Windows XP SP3 | VM 102 | AutoCAD 2000 (v11) | ✅ Baseline — all 6 menu items pass |
| Windows XP SP3 | VM 104 | AutoCAD 2000 (v3.60) | ✅ Deferred VLX loading (Bug 20 fix) |
| Windows 10 32-bit | VM 108 | AutoCAD 2000 | ✅ Source-mode, auto-configured |
| Windows 10 64-bit | VM 109 | AutoCAD 2000 (WoW64) | ✅ After Wow6432Node COM fix |
| Windows 10 64-bit | VM 109 | AutoCAD 2026 | ⏳ Forward-compatibility testing |
| Windows 10 64-bit | VM 109 | NanoCAD 25 | ⏳ Exploratory (LSP-only, no VLX/COM) |
Demo acceptance gate (from Software Development Lifecycle §6.4): Zero new Critical/High bugs during demo run. All 8 demo steps complete without user intervention or workarounds.
4) UI Reference Documentation — COMPLETED (Doc 31)
The UI reference documentation effort has been completed as 31 — Comprehensive Workflow & Human Factors Analysis (923 lines). This supersedes the original Week 3 placeholder plan.
What doc 31 provides (replacing the original menu-tree/command-reference deliverables):
| Doc 31 Section | Content | Lines |
|---|---|---|
| §3.1 Master Navigation Flow | Complete menu routing from AutoCAD startup through | 107–200 |
| §3.2–3.7 Workflow Maps | 6 detailed workflow maps: Panel creation, Panel editing, Batch processing, Site drawing, Engineering export/import, Materials list | 200–400 |
| §4 Manual ↔ Code Cross-Reference | 45-row table mapping every CSV Manual section to DCL file, LSP file(s), menu ID, and status (38 verified, 6 concerns, 1 broken) | 405–465 |
| §6 Critical Workflow Test Matrix | 65 independently testable workflows across 5 categories (P-01 to P-24, B-01 to B-08, E-01 to E-06, S-01 to S-07, U-01 to U-06) | 509–590 |
| §8 Human Factors Analysis | Nielsen’s 10 heuristics evaluation, complexity metrics (1,826 input fields, 65 dialogs), error probability analysis | 637–730 |
| Appendix A Complete Call Graph | Full function call graph for all major subsystems | 675–727 |
| Appendix B Dialog Inventory | 44 DCL dialogs, ~5,736 controls — largest: | 874–923 |
Original deliverables — status:

- `menu-tree.md` → Superseded by doc 31 §3 + §4 (45-row cross-reference with LSP/DCL mapping) and by AutoIT screenshot captures in `ui-reference/menu-tree/` and `reports/ocr-output/vm{102,103,104}-final/`
- `command-reference.md` → Superseded by doc 31 §6 (65-workflow test matrix with entry points, steps, and exit criteria)
Golden datasets
The ConstructiVision Sample Building project serves as the golden dataset for validation. These are actual project files shipped with the product and used in the AutoIT validation runs.
Primary test artifacts (in src/Project Files/ConstructiVision Sample Building/):
File |
Type |
Purpose |
|---|---|---|
|
Panel drawings |
Individual panel definitions with full geometry |
|
Site drawing |
Site layout with grid lines and panel attachments |
|
Project file |
Project configuration and panel list |
Test usage:
AutoIT validation: Phase 1 opens CSB001.dwg (panel), Phase 5 opens CSBsite1.dwg (site)
Manual regression: Create new panel in Sample Building project, verify against existing panels
Batch testing: B-01/B-02 run against all panels in CSBsite1.dwg
Engineering export: E-01/E-02 export from Sample Building panels, verify CSV output
Baseline comparison data:
reports/ocr-output/vm102-final/ — v11 reference screenshots (11 images, 6/6 pass)
reports/ocr-output/vm103-final/ — v7.0 reference screenshots (10 images, 6/6 pass)
reports/ocr-output/vm104-final/ — v7.0 patch reference screenshots (11 images, 6/6 pass) — successful re-run after Bug 20 fix
reports/ocr-output/vm104-feb28/ — v7.0 patch failed run (VLX not loaded, Bug 20) — retained for historical reference
_extract/vm102-total-uninstall/ — VM 102 filesystem + registry snapshot
_extract/vm103-total-uninstall/ — VM 103 filesystem + registry snapshot
Edge cases: Panels with all 20 features enabled simultaneously (P-20 workflow) stress-test the Boolean pipeline. The wc_dlg dialog with 615 controls across 5 pages is the densest UI path. See 31 — Comprehensive Workflow & Human Factors Analysis §8.3 for error probability analysis.
Output comparison strategy¶
Comparison leverages the OCR pipeline and cross-VM validation to detect regressions.
Automated (OCR text comparison):
Comparison |
Source |
Target |
Method |
|---|---|---|---|
Dialog titles |
|
New run OCR output |
Text diff — titles must match exactly |
Menu item count |
AutoIT |
New run log |
6/6 items must PASS |
Screenshot count |
Baseline: 11 (VM 102), 10 (VM 103), 11 (VM 104) |
New run |
Count must match or exceed baseline |
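The automated checks above reduce to two simple rules: dialog titles must diff clean against the baseline OCR output, and the screenshot count must match or exceed the baseline. A minimal sketch (function names are illustrative, not part of the pipeline):

```python
import difflib


def compare_dialog_titles(baseline_lines, new_lines):
    """Exact-match gate: OCR'd dialog titles must be identical line-for-line.

    Returns (passed, diff_lines); any difference at all is a FAIL.
    """
    diff = list(difflib.unified_diff(
        baseline_lines, new_lines,
        fromfile="baseline", tofile="new-run", lineterm=""))
    return (len(diff) == 0, diff)


def compare_screenshot_counts(baseline_count, new_count):
    """Count gate: the new run must match or exceed the baseline count."""
    return new_count >= baseline_count
```

Keeping the title comparison as a plain text diff means any OCR wording drift surfaces as a reviewable unified diff rather than a bare boolean.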
Manual (structured outputs):
matlist.txt — Materials list text file: compare line-by-line (quantities, hardware counts, concrete yardage)
Engineering CSV (dreng.lsp output) — Field-by-field comparison: panel dimensions, face codes (U/D), brace positions
DWG metadata — Compare layer names, block counts, title block attribute values via AutoLISP inspection
Panel dimensions — Tolerance-based: ± 0.001” for ft-in-fraction values (rangchck.lsp precision)
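The tolerance-based dimension check can be sketched by parsing ft-in-fraction strings into exact inches and comparing within ± 0.001". This is a simplified illustration; the grammar accepted by rangchck.lsp may differ, and the function names are ours:

```python
from fractions import Fraction

TOLERANCE_IN = Fraction(1, 1000)  # +/- 0.001" per the rule above


def ftin_to_inches(text):
    """Parse a ft-in-fraction string like 12'-6 1/2" into exact inches.

    Illustrative grammar only: optional feet ('), whole inches, fraction.
    """
    t = text.strip().rstrip('"')
    feet = Fraction(0)
    if "'" in t:
        ft_part, t = t.split("'", 1)
        feet = Fraction(int(ft_part))
        t = t.lstrip("-").strip()
    inches = Fraction(0)
    parts = t.split() if t else []
    if parts and "/" in parts[-1]:
        num, den = parts[-1].split("/")
        inches += Fraction(int(num), int(den))
        parts = parts[:-1]
    if parts:
        inches += Fraction(int(parts[0]))
    return feet * 12 + inches


def dims_match(a, b):
    """Tolerance gate: dimensions equal within +/- 0.001 inch."""
    return abs(ftin_to_inches(a) - ftin_to_inches(b)) <= TOLERANCE_IN
```

Using Fraction avoids the floating-point round-off that the tolerance exists to absorb in the first place, so the ± 0.001" band covers only genuine output differences.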
Cross-version comparison (3-VM):
The 3-VM strategy (VM 102 = v11, VM 103 = v7.0, VM 104 = v7.0 patch) enables version regression detection. All 3 VMs now pass AutoIT 6/6 — dialog content is identical across VMs 102 and 104. VM 103 (v7.0) has expected wording differences on Drawing Setup and Create New Drawing dialogs. The only visual discrepancy is the zoom/center of opened drawings — VM 104 captures the command line area in drawing screenshots while VM 102 captures the drawing viewport, likely due to desktop resolution or AutoCAD window layout differences. Doc 31’s 45-row cross-reference (§4) documents expected behavioral differences between versions.
File-Type Integrity Gates¶
ConstructiVision validation uses file-type-specific integrity gates instead of a single generic file-open check. A test passes for a file type only when its structural, semantic, and value gates all pass.
File type |
Structural gate |
Semantic gate |
Value gate |
Tolerance rule |
Evidence artifact |
|---|---|---|---|---|---|
DWG |
Opens in target AutoCAD runtime; drawing class resolves correctly |
Named Object Dictionary contains expected CV key ( |
Panel/site identity, key dimensions, version markers, and required dictionary payload values match baseline |
Exact for keys/identifiers; numeric tolerance only for regenerated floating values |
Open log, dictionary inspection output, section reconstruction summary |
DXF |
|
Required layer names and entity families match exporter contract |
Geometry dimensions, placements, and layer mapping match source model |
Layer names/entity families exact; coordinates and bounding boxes within numeric tolerance |
Parsed header/layer/entity summary + geometric diff report |
CVP |
Envelope fields present: |
Full project payload coherent (settings, panels, optional site, references) |
Project metadata/settings/panel groups/site relationships preserved per import mode rules |
Exact except intentionally regenerated IDs/timestamps in new-project import mode |
Schema result + key-value diff + reference integrity report |
CVT |
Envelope fields present; |
Template payload includes reusable settings and panels |
Required normalization fields stripped; geometry/defaults preserved |
Exact for strip-list and retained defaults |
Strip-list verification + template-import reconstruction summary |
CVPANEL |
Envelope fields present; |
Complete |
Mark/dimensions valid; group presence/slot shape stable; enabled features preserve source values |
Group presence/enums exact; coordinate precision within numeric tolerance |
Panel schema + per-group diff |
CVSITE |
Envelope fields present; |
Site payload includes scale/grid/north arrow/placement list |
Placements and rotations preserved; panel references valid in project-coupled tests |
IDs/references exact; coordinate precision within numeric tolerance |
Placement-table diff + orphan-reference check |
PDF |
Export succeeds and expected page count is generated for valid panels |
Required sections present per page (header, dimensions, openings/blockouts, embeds/connections, materials) |
Displayed marks/dimensions/material values align with recomputed source values |
Key labels/IDs exact; formatted numeric values within rounding tolerance |
Page/section inventory + extracted-value comparison |
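The gate logic above is conjunctive: a structural pass with a value-gate failure is still an overall FAIL, and a release may claim roundtrip coverage only for file types where every gate passed. A minimal sketch (names are illustrative):

```python
from dataclasses import dataclass


@dataclass
class GateResult:
    """Outcome of the three integrity gates for one file-type test."""
    structural: bool
    semantic: bool
    value: bool


def file_type_passes(result):
    """A file-type test passes only when all three gates pass."""
    return result.structural and result.semantic and result.value


def claimable_coverage(results):
    """File types a release candidate may claim roundtrip coverage for,
    given a mapping of file type -> GateResult."""
    return sorted(ft for ft, r in results.items() if file_type_passes(r))
```

In practice each boolean would be backed by the evidence artifact listed in the table (open log, parsed summary, diff report), so a True without evidence is not a pass.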
Panel Group Coverage (DXF/PDF)¶
Coverage below reflects current CV-WEB exporters (dxfExport.ts, pdfExport.ts) and material calculations (materialsCalc.ts).
Panel group |
DXF coverage |
PDF coverage |
Materials coverage |
Gap status |
|---|---|---|---|---|
|
Yes |
Yes |
Yes/partial by group |
Covered |
|
Yes (feature rectangles) |
No dedicated section |
Partial |
Partial |
|
No geometry emitted |
Yes (Embeds & Connections table) |
Partial (count-focused) |
Partial |
|
No |
No |
No/partial depending on flow |
Not Yet Represented |
Release And Traceability Rules¶
A release candidate cannot claim roundtrip integrity coverage for a file type unless all mandatory gates for that file type pass with evidence.
Any panel group marked Not Yet Represented must be listed as an explicit scope exception and cannot be counted as validated coverage.
Any structural-gate or value-gate failure for interchange flows must be recorded in the TB11 bug tracker (32-tb11-bug-tracker.md) and mapped to a DFMEA row in doc 31, or flagged as NEW when no row exists.
Regression Reversal Protocol¶
When a previously-passing test fails in a later validation run, the following protocol ensures evidence preservation and root-cause traceability.
Five-Step Regression Process¶
Preserve Evidence (Non-destructive)
Download and archive the first-pass screenshot set from reports/ocr-output/<first-pass-run-id>/
Download and archive the failing/review screenshot set from reports/ocr-output/<regression-run-id>/
Keep both directories in reports/ with clear naming: e.g., reports/ocr-output/vm108-g3-20260313-pass/ and reports/ocr-output/vm108-g4-20260315-regression/
Never overwrite or delete the first-pass evidence
Mark Status as REGRESSION
Update the test record in the validation step table with status = 🔄 REGRESSION
In the “Notes” column, add: “Previously PASS on [first-pass-date]; now FAIL on [regression-date]. See bug #[N].”
Example:
🔄 REGRESSION: Previously PASS on 2026-03-13; now FAIL on 2026-03-15. See bug #93.
Open or Link Bug Tracker Entry
Create a new GitHub Issue or update existing bug entry in 32-tb11-bug-tracker.md
Title format: [Regression] Test [step-N] failed after previously passing on [date] (expected dialog now missing/changed)
Body must include:
First Pass Date: [date]
Regression Date: [date]
File(s) Affected: list source files
Symptom: concrete OCR text or screenshot diff between first pass and regression runs
Impact: which release gate is blocked
Map to DFMEA and Assess RPN
Check 31-comprehensive-workflow-analysis.md Section 9 for existing failure mode that matches the regression
If match found: Add bug # to DFMEA Cross-Reference table in doc 32 with Match = “Yes, regression”
If NO match: Add new DFMEA row with S/O/D ratings and note as “NEW — unanticipated failure mode” in doc 31, then cross-reference in doc 32
Recalculate RPN; if RPN now exceeds threshold, escalate release readiness decision
Block Release Gate Until Resolved
Mark the affected step (Step 0–Step 12) with status = ❌ BLOCKED in the release readiness checklist
Do NOT increment release milestone or close integration gates until regression is fixed AND re-validated with new first-pass evidence
Rationale: A regression indicates either a codebase change introduced a defect, or the golden baseline was invalid. Either case requires corrective action before release.
Recommended Table Structure for Regression Tracking¶
Add a “Regression History” subsection in each step’s validation table:
First Pass |
Reviewed |
Status |
Regression Flag |
Bug Ref |
|---|---|---|---|---|
2026-03-13 |
2026-03-13 |
✅ PASS |
— |
— |
— |
2026-03-15 |
🔄 REGRESSION |
Pre=PASS, Now=FAIL |
#93 |
2026-03-16 |
2026-03-16 |
✅ PASS (Fix verified) |
Resolved |
#93 |
When NOT to Apply Regression Protocol¶
If a test was never expected to pass, mark as ❌ FAIL (not REGRESSION)
If test environment was materially changed (e.g., new VM, new AutoCAD version, fresh OS install), document the environmental change separately; regression flag applies only to software changes in the TB11 codebase or deployment configuration
If test is on a different file type or different panel group than prior runs, treat as independent validation, not regression
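The classification rules above (regression vs. plain failure vs. independent run) can be sketched as a small decision function; the status strings are illustrative:

```python
def classify_failure(previously_passed, environment_changed, same_target):
    """Decide how a failing run should be recorded, per the rules above.

    same_target: same file type and panel group as the prior run.
    environment_changed: new VM, new AutoCAD version, fresh OS install.
    """
    if not same_target:
        return "INDEPENDENT"   # treat as independent validation
    if environment_changed:
        return "ENVIRONMENT"   # document separately; not a regression flag
    if previously_passed:
        return "REGRESSION"    # apply the five-step protocol
    return "FAIL"              # test was never expected to pass
```

Encoding the precedence (target, then environment, then history) keeps the regression flag reserved for software changes in the TB11 codebase or deployment configuration.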
Validation Completion Arc Before Security Testing¶
Security testing is blocked until the full validation arc below is complete and evidenced.
Gate |
Required evidence |
Planned closure window |
|---|---|---|
Step completion gate |
Steps 0-12 complete for active configs with OCR/deterministic evidence |
Apr 7-Apr 21, 2026 (M2) |
Regression closure gate |
All required reruns complete with First Pass + Reviewed pairs; open regressions triaged in bug tracker |
Apr 7-Apr 21, 2026 (M2) |
Integration quality gate |
x32/x64 output quality verified on stabilized baseline |
May 19-May 26, 2026 (M2.5) |
Design readiness gate |
Documentation and design-phase prerequisites complete on validated baseline |
May 31, 2026 (M3) |
Security testing gate |
Security campaign can start only after the four gates above are closed |
Jun 1, 2026 (M4 start) |
Release-readiness rule: if any pre-security gate is open, release status remains blocked and security testing start is deferred.
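The sequencing rule can be expressed directly: security testing is startable only when every pre-security gate is closed. A minimal sketch with illustrative gate names:

```python
# The four gates that must close before the security campaign starts.
PRE_SECURITY_GATES = (
    "step_completion",
    "regression_closure",
    "integration_quality",
    "design_readiness",
)


def security_testing_may_start(gate_status):
    """True only when every pre-security gate is recorded as closed.

    gate_status maps gate name -> "open" | "closed"; a missing gate
    counts as open, so partial records cannot unblock security testing.
    """
    return all(gate_status.get(g) == "closed" for g in PRE_SECURITY_GATES)
```

Treating a missing entry as open makes the check fail safe: an incomplete scorecard defers security testing rather than silently allowing it.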
Acceptance criteria¶
Acceptance criteria are derived from the DFSS Voice of the Customer → CTQ matrix (see Software Development Lifecycle §9.1).
Must match exactly:
Criterion |
Measurement |
Source |
|---|---|---|
All 6 top-level menu commands respond |
AutoIT Phase 3: 6/6 PASS |
|
All 93 modules load without error |
|
AutoCAD command line |
Panel solid generation completes |
|
File existence check |
Engineering export matches baseline |
Field-by-field CSV comparison |
|
Menu registration correct |
|
|
Can vary within tolerance:
Criterion |
Tolerance |
Reason |
|---|---|---|
Panel dimensions |
± 0.001” |
ft-in-fraction parsing precision |
Center of gravity (centgrav.lsp) |
± 0.01 units |
MASSPROP floating-point variation |
Materials list quantities |
Exact match |
Integer counts — no tolerance |
Batch completion time |
± 50% |
VM performance varies |
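The mixed tolerance rules above (absolute bands for dimensions and center of gravity, exact match for integer counts, a relative ± 50% band for batch time) can be sketched as one checker. The criterion names are illustrative:

```python
# Absolute tolerances from the table above, in each criterion's own units.
ABS_TOLERANCES = {
    "panel_dimension_in": 0.001,   # ft-in-fraction parsing precision
    "center_of_gravity": 0.01,     # MASSPROP floating-point variation
    "materials_quantity": 0.0,     # integer counts: exact match only
}


def within_tolerance(criterion, baseline, observed):
    """Apply the per-criterion tolerance rule.

    Batch time is the only relative band (+/- 50% of baseline);
    everything else uses an absolute tolerance, including 0.0 (exact).
    """
    if criterion == "batch_time_s":
        return abs(observed - baseline) <= 0.5 * baseline
    return abs(observed - baseline) <= ABS_TOLERANCES[criterion]
```

Keeping the zero-tolerance criteria in the same table (rather than special-casing them) makes "exact match" just the degenerate absolute band.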
Acceptable modernization deltas:
Delta |
Status |
Justification |
|---|---|---|
DCL dialog titles differ between v3.60 and v11 |
✅ Expected |
v3.60 used generic “AutoCAD 2000” titles; v7.0+ set explicit dialog labels |
WinHelp ( |
✅ Known (DFMEA #10, RPN=500) |
Help system replacement is a future milestone. Bug 37: VLX’s internal |
4 menu items route to same dialog |
✅ Known (DFMEA #13, BRK-05) |
|
NanoCAD 25: no VLX/ARX/COM support |
✅ Expected |
LSP-only fork ( |
Tools & automation¶
Active testing tools (deployed and validated):
Tool |
Version |
Location |
Purpose |
|---|---|---|---|
AutoIT |
3.3.16 |
|
Automated UI testing — |
OCR pipeline |
Python + Tesseract |
|
BMP→PNG conversion + text extraction for screenshot comparison |
Total Uninstall |
6.18.0 |
|
Filesystem + registry snapshot before/after install |
Regshot |
1.9.0 |
|
Quick registry diff for targeted changes |
InCtrl5 |
5.0 |
|
Before/after install snapshot comparison |
SSH + |
XP built-in |
All VMs via Bitvise SSH Server |
Remote execution of AutoIT scripts (only method for interactive GUI on XP) |
Configure-ConstructiVision.ps1 |
— |
|
Automated environment setup on Win10 VMs |
Testing infrastructure (from Software Development Lifecycle §3):
Component |
Purpose |
DFSS Phase |
|---|---|---|
AutoIT automation |
Validate $Y$ (menu response) against specification |
Phase 3: Validate |
OCR text extraction |
Detect regressions by comparing dialog text across runs |
Phase 3: Validate |
Total Uninstall snapshots |
Diagnose $X$ (filesystem/registry state) when $Y$ fails |
Phase 4: Analyze |
SSH remote commands |
Measure $X$ on VMs without manual login |
Phase 4: Analyze |
DFMEA table (doc 31 §9) |
Predict failures before testing — prioritize by RPN |
Phase 5: Optimize |
Bug tracker (doc 32) |
Bidirectional traceability: bug → DFMEA → test → fix |
All phases |
Competitive Parity Validation Plan¶
Overview¶
This section defines the epic-level validation plan to ensure ConstructiVision (CV-CAD + cv-web) reaches competitive parity with the established industry benchmark product no later than July 1, 2026. The field-level checklist in 41. Competitive Parity Checklist — ConstructiVision vs Industry Benchmark provides 674 individually verifiable capability rows across 14 domains — each gap item maps to a story in this plan.
Terminology: Throughout this plan, “industry benchmark” (IB) refers to the leading competing tilt-up panel design software product against which ConstructiVision capabilities are measured. See 41. Competitive Parity Checklist — ConstructiVision vs Industry Benchmark for the complete field-level assessment.
Gate Thresholds¶
Level |
Threshold |
Target Date |
Status |
|---|---|---|---|
App Store Release |
P0 domains ≥ 90% CV-CAD parity |
Jun 1 |
Not started |
cv-web Alpha |
P0 domains ≥ 80% cv-web parity to CV-CAD |
Jun 15 |
Not started |
cv-web Beta |
P0 + P1 domains ≥ 75% cv-web parity |
Jun 27 |
Not started |
Competitive Parity |
Engineering domain (D4) ≥ 20% |
Jul 1 |
Not started |
Current Baseline (April 2026)¶
Product |
Total Matched / 795 |
Overall % |
|---|---|---|
IB |
301 |
38% |
CV-CAD |
568 |
71% |
cv-web |
431 |
54% |
Note
CV-CAD leads in 10/14 domains. cv-web leads in 3 domains (Import/Roundtrip, UI/UX, Plans Data). IB leads only in Engineering & Structural (D4 at 93%). The combined CV platform (CAD + web) covers 87% of all items vs IB’s 38%.
Validation Method¶
Automated: cv-web dialog inventory compared against ActiveDialog union type (47 types) — verify each dialog renders and accepts input.
Manual: CV-CAD panel/site/project entity fields verified against checklist via AutoIT menu validation on VMs 102/108.
Evidence-based: Each ✅/⚠️/❌ assessment in doc 41 must trace to a primary source artifact listed in the Evidence Chain table.
Periodic recount: Domain rollup percentages recalculated at end of each sprint. Each epic carries a “last recalculated” date.
Sprint Calendar¶
Six two-week sprints from April 7 through July 1, organized by validation priority.
Sprint |
Dates |
Focus |
Epics |
Validation Gate |
|---|---|---|---|---|
1 |
Apr 7–18 |
CV-CAD P0 validation — project & panel geometry proof |
E-01, E-02 |
CV-CAD P0 baseline |
2 |
Apr 21 – May 2 |
CV-CAD drawing, export, hardware validation |
E-05, E-06, E-08 |
CV-CAD coverage expansion |
3 |
May 5–16 |
cv-web site planning + batch gap closure |
E-03, E-07 |
cv-web infrastructure |
4 |
May 19–30 |
cv-web drawing production + calculated properties |
E-05, E-10 |
cv-web feature depth |
5 |
Jun 2–13 |
cv-web export, platform, import/roundtrip |
E-06, E-09, E-11 |
App Store + Alpha gates |
6 |
Jun 16–27 |
Integration validation + final gate scoring |
E-12, E-14, all |
Beta + Parity gates |
— |
Jun 30 – Jul 1 |
Buffer — final gate review and release decision |
All |
Release decision |
Epic Map¶
Fourteen epics — one per domain from 41. Competitive Parity Checklist — ConstructiVision vs Industry Benchmark. Each epic lists its P0/P1 gap stories, sprint assignment, and July 1 target. Post-launch epics (D4, D13) are documented in full but out of July 1 scope.
E-01: Project Management Parity (Domain 1 — 40 items)¶
Metric |
CV-CAD |
cv-web |
|---|---|---|
Current |
95% (38/40) |
83% (33/40) |
July 1 Target |
98% |
90% |
Gap Count (❌) |
1 |
9 |
Sprint |
1 |
3 |
Last Recalculated |
Apr 3, 2026 |
Apr 3, 2026 |
Features:
F-1.1 — Project metadata fields: Contractor name (#8)
F-1.2 — Drawing/scale settings: Drawing scale (#23), reinforcing drawing scale (#24)
F-1.3 — Cloud collaboration: Cloud project list, team sharing, permission tiers (#36-40) — P2, post-launch
P0/P1 Stories:
# |
Capability |
Product |
Priority |
|---|---|---|---|
8 |
Contractor name field |
CV-CAD, cv-web |
P1 |
23 |
Drawing scale |
cv-web |
P1 |
P2 stories: 7 items (doc 41 #34-40). Post-launch scope.
E-02: Panel Geometry Parity (Domain 2 — 135 items)¶
Metric |
CV-CAD |
cv-web |
|---|---|---|
Current |
96% (115/120) |
93% (110/120) |
July 1 Target |
98% |
95% |
Gap Count (❌) |
5 |
9 |
Sprint |
1 |
1 |
Last Recalculated |
Apr 3, 2026 |
Apr 3, 2026 |
Features:
F-2.1 — Panel shape extensions: Top peak distance (#53), top peak elevation (#54)
F-2.2 — Opening enhancements: Future opening flag (#85), expansion gap (#74)
F-2.3 — Feature strip/pilaster refinement: Feature strip type (#124), pilaster/lintel slots
P0/P1 Stories:
# |
Capability |
Product |
Priority |
|---|---|---|---|
53 |
Top peak distance |
CV-CAD, cv-web |
P1 |
54 |
Top peak elevation |
CV-CAD, cv-web |
P1 |
74 |
Expansion gap |
cv-web |
P1 |
85 |
Future opening flag |
cv-web |
P1 |
P2 stories: 5 items. Post-launch scope.
E-03: Site Planning Parity (Domain 3 — 78 items)¶
Metric |
CV-CAD |
cv-web |
|---|---|---|
Current |
94% (60/65) |
23% (18/65) |
July 1 Target |
95% |
60% |
Gap Count (❌) |
0 |
60 |
Sprint |
3 |
3 |
Last Recalculated |
Apr 3, 2026 |
Apr 3, 2026 |
Features:
F-3.1 — Grid configuration: X/Y-axis label, direction controls, number/letter grid lines (#176-185) — 10 items
F-3.2 — Wall line management: Wall start/end positions, elevations, panel assignments (#186-196) — 11 items
F-3.3 — Slab & infrastructure: Slab lines, grid refs, thickness (#197-200) — 4 items
F-3.4 — Site layers: XY grid, walls, slab, connections, footings, hardware, dimensions (#201-208) — 8 items
F-3.5 — Site operations: Detach panels, tilt-up sequence, layout, save (#212-215) — 4 items
F-3.6 — Site dialogs: Grid, wall, slab, footing, column, weld dialogs (#216-221) — 6 items
F-3.7 — View/print controls: View, print, rename (#222-227) — 6 items
P0/P1 Stories (cv-web): ~51 items across features F-3.1 through F-3.7. See doc 41 items #176-227 for full listing.
Note
Largest cv-web gap. Site planning has 60 ❌ items in cv-web — this is the single largest domain gap and the primary Sprint 3 focus. CV-CAD already covers 94% of this domain.
E-04: Engineering & Structural Parity (Domain 4 — 127 items) — POST-LAUNCH¶
Metric |
CV-CAD |
cv-web |
|---|---|---|
Current |
12% (15/127) |
5% (6/127) |
July 1 Target |
20% |
10% |
Gap Count (❌) |
105 |
115 |
Sprint |
— |
— |
Last Recalculated |
Apr 3, 2026 |
Apr 3, 2026 |
Features:
F-4.1 — Load inputs: Design/construction wind, seismic, roof/floor loads (#254-276) — 23 items
F-4.2 — Material properties: Concrete/steel properties, joint width, form dimensions (#277-284) — 8 items
F-4.3 — Reinforcement design: Bar size/spacing ranges, cover, deflection limits (#285-296) — 12 items
F-4.4 — Design methodology: ACI 318, P-Delta, load combinations, code selection (#297-309) — 13 items
F-4.5 — Strip analysis outputs: Strip dimensions, stress, moments, deflection, safety factors (#310-342) — 33 items
F-4.6 — Engineering actions: Auto-design, check, optimize, manual override (#343-348) — 6 items
F-4.7 — Error/warning messages: Engineering error dialogs, pass/fail indicators (#349-354) — 6 items
F-4.8 — Advanced features: Insulated panels, support types, two-story design (#355-363) — 9 items
Warning
Post-launch scope. Engineering (D4) is the IB’s strongest domain (93%) and ConstructiVision’s largest gap. The IB product is a structural engineering tool at its core — ConstructiVision is an estimating/drafting tool. Full engineering parity requires a dedicated structural analysis engine and is explicitly deferred to the post-launch roadmap. The July 1 gate requires only D4 ≥ 20%.
P0 stories: #254 (design wind load), #259-260 (roof dead/live load), #268-269 (floor dead/live load), #277-278 (concrete strength/density), #312 (structural thickness), #324 (bar quantity/size), #329-330 (factored/resisting moment), #342 (stability safety factor). Total: ~12 P0 items.
P1 stories: ~80 items (#255-341 range). See doc 41 for full listing.
E-05: Drawing Production Parity (Domain 5 — 55 items)¶
Metric |
CV-CAD |
cv-web |
|---|---|---|
Current |
95% (52/55) |
33% (18/55) |
July 1 Target |
98% |
65% |
Gap Count (❌) |
1 |
37 |
Sprint |
2 |
4 |
Last Recalculated |
Apr 3, 2026 |
Apr 3, 2026 |
Features:
F-5.1 — Panel drawing automation: Automated panel drawing creation (#364), opposite hand (#366)
F-5.2 — Dimensioning: Auto-dimensioning pipeline (#367-374) — 8 items
F-5.3 — Title block: Title block generation and fields (#376-383) — 8 items
F-5.4 — Layer control: 7-layer subset system (#385-391) — 7 items
F-5.5 — 3D rendering: Viewpoints, render modes (#392-401) — 10 items
F-5.6 — Reinforcing DXF: Reinforcing design and placing DXF export (#407-408) — 2 items
P0/P1 Stories (cv-web):
# |
Capability |
Priority |
|---|---|---|
364 |
Automated panel drawing creation |
P0 |
366-374 |
Opposite hand + dimensioning pipeline |
P1 |
376-383 |
Title block generation & fields |
P1 |
385-391 |
Layer control (7-subset system) |
P1 |
407-408 |
Reinforcing design/placing DXF |
P1 |
E-06: Export & Output Parity (Domain 6 — 45 items)¶
Metric |
CV-CAD |
cv-web |
|---|---|---|
Current |
71% (32/45) |
49% (22/45) |
July 1 Target |
80% |
65% |
Gap Count (❌) |
13 |
23 |
Sprint |
2 |
5 |
Last Recalculated |
Apr 3, 2026 |
Apr 3, 2026 |
Features:
F-6.1 — Print operations: Print single/all/selected panels (#411-413) — 3 items
F-6.2 — Report generation: Material quantities, engineering report, summary, cut list (#418-425) — 8 items
F-6.3 — Data exchange: Export/import data file (#426-427) — 2 items
F-6.4 — Panel grouping: Auto-detect like panels (#434-435) — 2 items
F-6.5 — BIM integration: IFC/BIM export (#431-433) — P2, 3 items
P0/P1 Stories:
# |
Capability |
Product |
Priority |
|---|---|---|---|
411-413 |
Print single/all/selected panels |
cv-web |
P1 |
418-420 |
Material quantity reports |
cv-web |
P1 |
421-425 |
Engineering report, summary, cut list, CRSI, rebar export |
CV-CAD, cv-web |
P1 |
426-427 |
Export/import data file |
cv-web |
P1 |
434-435 |
Auto-detect like panels |
CV-CAD, cv-web |
P1 |
E-07: Batch & Productivity Parity (Domain 7 — 35 items)¶
Metric |
CV-CAD |
cv-web |
|---|---|---|
Current |
86% (30/35) |
23% (8/35) |
July 1 Target |
90% |
50% |
Gap Count (❌) |
5 |
27 |
Sprint |
3 |
3 |
Last Recalculated |
Apr 3, 2026 |
Apr 3, 2026 |
Features:
F-7.1 — Batch operations: Batch print/export utilities (#439-442) — 4 items
F-7.2 — Batch scope: Scope controls — all/selected/range (#443-445) — 3 items
F-7.3 — Panel operations: Opposite hand mirror, clone wall (#446-447) — 2 items
F-7.4 — Revision tracking: Revision date/description/sheet (#452-454) — 3 items
P0/P1 Stories (cv-web):
# |
Capability |
Priority |
|---|---|---|
439-442 |
Batch print/export utilities |
P1 |
443-445 |
Batch scope controls |
P1 |
446 |
Opposite hand mirror |
P1 |
447 |
Clone wall (full copy) |
P1 |
E-08: Hardware & Connections Parity (Domain 8 — 75 items)¶
Metric |
CV-CAD |
cv-web |
|---|---|---|
Current |
96% (72/75) |
87% (65/75) |
July 1 Target |
98% |
92% |
Gap Count (❌) |
3 |
10 |
Sprint |
2 |
2 |
Last Recalculated |
Apr 3, 2026 |
Apr 3, 2026 |
Features:
F-8.1 — Steel properties: Bar mark, steel weight, yield strength (#510-513)
F-8.2 — Connection refinement: Weld plate details, anchor bolt specs
P0/P1 Stories (cv-web):
# |
Capability |
Priority |
|---|---|---|
510 |
Bar mark |
P1 |
512 |
Steel weight |
P1 |
513 |
Yield strength |
P1 |
E-09: Platform & Infrastructure Parity (Domain 9 — 30 items)¶
Metric |
CV-CAD |
cv-web |
|---|---|---|
Current |
93% (28/30) |
60% (18/30) |
July 1 Target |
95% |
75% |
Gap Count (❌) |
2 |
12 |
Sprint |
5 |
5 |
Last Recalculated |
Apr 3, 2026 |
Apr 3, 2026 |
Features:
F-9.1 — Data persistence: Local storage, backup/restore
F-9.2 — Cloud infrastructure: Cloud storage, user login, RBAC, project dashboard (#517, #524-526) — P2, deferred
P0/P1 Stories: Minimal — most cv-web gaps are P2 (cloud features). Focus is on solidifying local persistence and offline-first architecture.
E-10: Calculated Properties & Warnings Parity (Domain 10 — 35 items)¶
Metric |
CV-CAD |
cv-web |
|---|---|---|
Current |
57% (20/35) |
46% (16/35) |
July 1 Target |
70% |
60% |
Gap Count (❌) |
15 |
19 |
Sprint |
4 |
4 |
Last Recalculated |
Apr 3, 2026 |
Apr 3, 2026 |
Features:
F-10.1 — Structural indicators: Slenderness ratio (#540), stability safety factor (#541), max structural thickness (#542)
F-10.2 — Material quantities: Concrete volume, rebar weight, hardware count (#545-550)
F-10.3 — Cost estimation: Unit costs, total project cost (#548-550)
F-10.4 — Live summary: Physical summary while editing (#543), computed dimensions (#557-560)
F-10.5 — Engineering warnings: Pass/fail indicators per strip (#554)
P0/P1 Stories:
# |
Capability |
Product |
Priority |
|---|---|---|---|
540 |
Slenderness ratio |
CV-CAD, cv-web |
P1 |
541 |
Stability safety factor |
CV-CAD, cv-web |
P1 |
542 |
Max structural thickness |
CV-CAD, cv-web |
P1 |
543 |
Live physical summary while editing |
CV-CAD |
P1 |
545-550 |
Material quantities & cost |
CV-CAD, cv-web |
P1 |
554 |
Engineering pass/fail per strip |
CV-CAD, cv-web |
P1 |
E-11: Import & Roundtrip Parity (Domain 11 — 25 items)¶
Metric |
CV-CAD |
cv-web |
|---|---|---|
Current |
32% (8/25) |
72% (18/25) |
July 1 Target |
45% |
80% |
Gap Count (❌) |
17 |
7 |
Sprint |
5 |
5 |
Last Recalculated |
Apr 3, 2026 |
Apr 3, 2026 |
Features:
F-11.1 — cv-web → CV-CAD roundtrip: CVP format import/export, field mapping validation
F-11.2 — Data safety: Auto-save, snapshots (#574-575)
F-11.3 — External formats: DXF/DWG import, IFC roundtrip (P2)
P0/P1 Stories (CV-CAD):
# |
Capability |
Priority |
|---|---|---|
574 |
Auto-save |
P1 |
575 |
Snapshots |
P1 |
Note
cv-web leads this domain (72% vs CV-CAD 32%). The cv-web import adapter and CVP format support are already strong. Focus is on improving CV-CAD’s import capabilities and cross-platform data safety.
E-12: UI/UX & Workflow Parity (Domain 12 — 65 items)¶
Metric |
CV-CAD |
cv-web |
|---|---|---|
Current |
80% (52/65) |
85% (55/65) |
July 1 Target |
85% |
90% |
Gap Count (❌) |
13 |
10 |
Sprint |
6 |
6 |
Last Recalculated |
Apr 3, 2026 |
Apr 3, 2026 |
Features:
F-12.1 — Navigation modes: Project/building/elevation modes (#626-629)
F-12.2 — View controls: Wall display type toggle (#630), zoom/pan workflow
F-12.3 — Workflow refinement: Dialog flow optimization, keyboard shortcuts
P0/P1 Stories (CV-CAD):
# |
Capability |
Priority |
|---|---|---|
626-629 |
Project/building/elevation navigation modes |
P1 |
P0/P1 Stories (cv-web): Minimal — cv-web already leads this domain at 85%.
E-13: Plans Data (Domain 13 — 28 items) — POST-LAUNCH¶
Metric |
CV-CAD |
cv-web |
|---|---|---|
Current |
0% (0/28) |
100% (28/28) |
July 1 Target |
N/A |
100% |
Gap Count (❌) |
28 |
0 |
Sprint |
— |
— |
Last Recalculated |
Apr 3, 2026 |
Apr 3, 2026 |
Note
cv-web exclusive domain. Plans Data is a cv-web-only feature set (PlansData interface). CV-CAD’s lack of coverage here is by architectural design — this capability has no AutoCAD equivalent. No action required.
E-14: Engineering Export & Lift Integration Parity (Domain 14 — 22 items)¶
Metric |
CV-CAD |
cv-web |
|---|---|---|
Current |
82% (18/22) |
5% (1/22) |
July 1 Target |
85% |
25% |
Gap Count (❌) |
4 |
17 |
Sprint |
6 |
6 |
Last Recalculated |
Apr 3, 2026 |
Apr 3, 2026 |
Features:
F-14.1 — Lift firm export: Dayton/Richmond/Meadow-Burke export formats (#667-669) — CV-CAD ⚠️ (removed in v7.0)
F-14.2 — Lift analysis basics: Panel CG location, weight, tilt angle (#653-656)
F-14.3 — Lift design outputs: Insert locations, strongback requirements (#657-665)
F-14.4 — Case study templates: Sample project walkthroughs (#670-673)
P0/P1 Stories (cv-web):
# |
Capability |
Priority |
|---|---|---|
653-656 |
Panel CG, weight, tilt angle |
P1 |
657-665 |
Insert locations, strongback requirements |
P1 |
Epic Summary¶
Epic |
Domain |
Items |
CV-CAD |
cv-web |
CV-CAD ❌ |
cv-web ❌ |
Sprint |
Scope |
|---|---|---|---|---|---|---|---|---|
E-01 |
Project Management |
40 |
95% |
83% |
1 |
9 |
1, 3 |
Launch |
E-02 |
Panel Geometry |
135 |
96% |
93% |
5 |
9 |
1 |
Launch |
E-03 |
Site Planning |
78 |
94% |
23% |
0 |
60 |
3 |
Launch |
E-04 |
Engineering & Structural |
127 |
12% |
5% |
105 |
115 |
— |
Post-launch |
E-05 |
Drawing Production |
55 |
95% |
33% |
1 |
37 |
2, 4 |
Launch |
E-06 |
Export & Output |
45 |
71% |
49% |
13 |
23 |
2, 5 |
Launch |
E-07 |
Batch & Productivity |
35 |
86% |
23% |
5 |
27 |
3 |
Launch |
E-08 |
Hardware & Connections |
75 |
96% |
87% |
3 |
10 |
2 |
Launch |
E-09 |
Platform & Infrastructure |
30 |
93% |
60% |
2 |
12 |
5 |
Launch |
E-10 |
Calculated Properties |
35 |
57% |
46% |
15 |
19 |
4 |
Launch |
E-11 |
Import & Roundtrip |
25 |
32% |
72% |
17 |
7 |
5 |
Launch |
E-12 |
UI/UX & Workflow |
65 |
80% |
85% |
13 |
10 |
6 |
Launch |
E-13 |
Plans Data |
28 |
0% |
100% |
28 |
0 |
— |
Post-launch |
E-14 |
Eng Export & Lift |
22 |
82% |
5% |
4 |
17 |
6 |
Launch |
— |
TOTAL |
795 |
71% |
54% |
212 |
355 |
Gate Scoring Integration¶
Domain rollup scores from 41. Competitive Parity Checklist — ConstructiVision vs Industry Benchmark feed directly into the competitive parity gate in 37. QA Autopilot Audit And Risk Gates Phase 2 scoring. At the end of each sprint:
Recount — Recalculate domain rollup %s for all 14 domains. Update the “Last Recalculated” date on each epic.
Score — Evaluate the 4-tier gate threshold table against updated rollup numbers.
Report — Update the gate scoreboard in doc 37 with current pass/fail status per tier.
Adjust — If a sprint under-delivers, reassign stories to the next sprint and document the deviation.
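The recount-and-score steps above can be sketched as a rollup plus a threshold check against the four-tier gate table. This is a simplified model (each tier reduced to its single headline threshold; names are ours, not doc 37's):

```python
def rollup(matched, total):
    """Domain rollup percentage, recalculated at end of each sprint."""
    return round(100 * matched / total, 1)


def evaluate_gates(p0_cvcad, p0_cvweb, p0p1_cvweb, d4_cvcad):
    """Evaluate the 4-tier gate thresholds from the table above.

    Inputs are rollup percentages; each tier is pass/fail.
    """
    return {
        "app_store": p0_cvcad >= 90,          # P0 domains, CV-CAD
        "cvweb_alpha": p0_cvweb >= 80,        # P0 domains, cv-web vs CV-CAD
        "cvweb_beta": p0p1_cvweb >= 75,       # P0 + P1 domains, cv-web
        "competitive_parity": d4_cvcad >= 20,  # Engineering domain (D4)
    }
```

Recomputing the dict each sprint and diffing it against the prior scoreboard is what the "Report" step writes into doc 37.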
Note
The competitive parity gate is one of five gates evaluated during the QA autopilot audit (doc 37). A gate failure here blocks the corresponding release milestone. See doc 37 Phase 2 for the full gate evaluation procedure.
Immediate next steps¶
Completed: 5-Week Sprint (Jan 21 – Feb 24) ✅¶
See installer-rebuild-sprint.md (archived) for the detailed day-by-day plan.
Week 1 (Jan 21-27): v3.60 Installer Rebuild¶
Set up InstallShield 5.x in the Windows XP VM
Import setup.rul (4,163 lines of decompiled InstallScript)
Compile and test the installer in the XP VM
Create pre/post install snapshots
Run WinDiff to document all system changes
Fix known bugs (registry profile detection, AutoCAD running check)
Week 2 (Jan 27 – Feb 3): v7.0 Reverse Engineering¶
Inspect existing v7.0/v11 desktop installation
Export registry keys and file listings
Compare v3.60 vs v7.0 differences
Document target state specification
Draft v7.0 installer script (deferred — WiX approach aborted)
Weeks 4-5 (Feb 10-24): OS/AutoCAD Compatibility Testing¶
Setup VMs: XP SP3, Vista, Win 7, Win 10 (32-bit and 64-bit)
Install appropriate AutoCAD versions
Run test matrix (8 tests × 5 platforms)
Document all issues found — see live totals in Bug Tracker — Validation Campaign
Iterative fix cycles (source-mode .lsp fixes, retest)
Regression test final build on all platforms
Week 6 (Feb 24-28): AutoIT Validation + Documentation¶
Develop `cv-menu-validation.au3` (572 lines)
Run automated validation on VMs 102, 103, 104 (3 final runs + 8 debug runs)
OCR pipeline — screenshots → text extraction → `reports/ocr-output/`
Cross-reference AutoIT results with doc 31 workflow matrix
Deploy deferred VLX loading to VM 104 (Bug 20 fix)
VM 109 CAD workstation — AutoCAD 2026 + NanoCAD 25 installed
NanoCAD 25 fork — `nanocad25/` subdirectory with reduced 150-file set
Doc 35 SDLC document (1,007 lines)
Doc 06 testing strategy overhaul (this document)
Remote Access Setup (parallel, by Feb 14)¶
Enable SSH + RDP for alpha testers (Tai, Dat)
Document access procedures — `scripts/Deploy-AlphaVM.ps1`
Setup audit logging (deferred)
Weeks 7–8 (Mar 3–14): P0 Incremental Baseline Validation — ACTIVE¶
P0 validates one capability at a time across 7 configurations (A–G), never skipping ahead. See Test layers → Priority 0 for full procedures and OCR evidence.
Step 0: AutoCAD launches cleanly (7/7 configs PASS — Mar 3–4)
Step 1: Standardize window layout (7/7 configs PASS — Mar 3–4)
Step 2: CV initialization verification (7/7 PASS — Mar 4–6)
Step 3: Drawing open + CV behavior verification (6/7 PASS — B–G, Mar 6; A skipped)
Steps 4–6: Incremental progcont decomposition — covered by Step 7 (Step 4 = single progcont, Step 5 = file dialogs, Step 6 = utilities; all 11 items tested in Step 7’s sweep, B–G PASS Mar 6)
Step 7: Simple menu items — all non-submenu items in one test (6/7 PASS — B–G, Mar 6). Config A not yet tested.
Step 8: Change 3D Viewpoint submenu — 10 viewpoint commands (5/7 PASS — B, D–G, Mar 6–8). A/C not tested.
Step 9: View Layers submenu — PINNED (Mar 9). Single-panel layer commands work (D–G). Multi-story layer support broken; needs multi-story drawing to test. Site dialog PINNED (freeze/thaw).
Step 10: Shading submenu — Off/Hidden/Shaded (5/7 PASS — B, D–G, Mar 9). A/C not tested.
Step 11: Print submenu — panel print items 1–6 (0/7 — A–G)
Step 12: Panel detection — site layouts, materials, revision (0/7 — A–G, Bug 63/87)
Graduation: Full `cv-menu-validation.au3` (0/7 — depends on Steps 0–12)
G-series trace campaign (Mar 14–21): G1+G2+G3 = 10/10 unique progcont values traced on VM108. All menu routes dispatch correctly. Sub-dialog validation next.
Panel data archaeology (Mar 21–23): Compact `panel` NOD dictionary fully decoded. 14 sections, 230 rows, JSON schema v1 + decoder mapping published. Byte-identical across VM102/104/108.
Site data archaeology (Mar 24): Compact `site` NOD dictionary fully decoded. 8 sections (ms, gx, gy, wl, sl, sc, tc, rc), 537 total rows. Key finding: DXF code-2 values are strings requiring a `(read)` parse. JSON schema + decoder mapping published.
Note: Step 7 passed on all TB11 configs (D/E/F/G) — progcont routing works for all 11 menu items tested. Items 4 and 6 show behavioral differences (PB11 VLX has save-first intercept logic not in recovered source). Steps 8–12 test submenu items which also rely on progcont routing.
Week 9 (Mar 14–21): G0/G1/G2/G3 Trace Campaign — COMPLETE¶
The G-series campaign instruments AutoCAD’s LISP environment to capture a function-level call trace for every progcont value across all reachable menu paths. G1 opens CSB001.dwg and fires all 8 progcont values that operate on panel drawings. G2 opens CSBsite1.dwg and fires the site progcont values. G3 covers the 5 remaining gaps (Create New Project from Drawing1, Slope Calculator on panel, Edit Existing Drawing, Create New Drawing, Batch Utilities). G0 captures the cold-start state.
Purpose: Map the actual execution path for each menu item — which functions are called, in what order, with what state — to locate the implementation gaps between TB11 source-mode and PB11 VLX behavior.
Result: All 10 unique progcont values (17 menu items) now have runtime trace evidence. Every menu route dispatches correctly and opens its expected dialog. Internal dialog components (file operations, batch processing, template workflows) remain untested — that’s the next phase.
Key results (March 21, 2026 — G1+G2+G3 complete, Config E / VM 108):
| Script | Drawing | progcont Values | Run | Result |
|---|---|---|---|---|
| G1 | CSB001 (panel) | 1, 262209, 262273, 262401, 262465, 262657, 263169, 264193 | | ✅ 8/8 |
| G2 | CSBsite1 (site) | 1, 8193, 262209, 262273, 262401, 262465, 262657, 263169, 264193 | | ✅ 9/9 |
| G3 | Drawing1→CSB001 | 262153, 8193 (panel), 262145, 262161, 262177 | | ✅ 5/5 |
Bugs fixed during this campaign: 82–92 (see Bug Tracker — Validation Campaign). Key fixes:
Bug 86+89: PANATT `bad argument type: stringp nil` — panelvar nil in fresh session (`4630e005`/`cd63f6c8`)
Bug 87: Materials List `quit/exit abort` — ssget failure (`cd63f6c8`)
Bug 88: G2 CSBsite1 XREF timing — WinWait 60s + Sleep 20s for 59 XREFs (`98914a74`)
Bug 90: View Select Layers wrong dialog type on panel (`cd63f6c8`)
Bug 91: Print operations `Cannot find layer "custom"` — tblsearch guard (`7f1bbfba`)
Bug 92: No-project guard — blocks progcont > 1 when no project loaded, excepts 262153 (`a9146d1f`)
G3 detail (5 remaining gaps):
| progcont | Menu Item | Context | Dialog Shown | Notes |
|---|---|---|---|---|
| 262153 | Create New Project | Drawing1 (no project) | | Bug 92 exempt — only value allowed without project context |
| 8193 | Slope Calculator | CSB001 (panel) | | G2 tested on site; G3 confirms same dialog on panel |
| 262145 | Edit Existing Drawing | CSB001 | | getfiled shown; internal file-open untested |
| 262161 | Create New Drawing | CSB001 | | getfiled shown; template copy untested |
| 262177 | Batch Utilities | CSB001 | | DCL dialog shown; batch operations untested |
Assessment: All dialogs open and show their expected options. However, most dialogs have broken internal components that will each need dedicated sub-dialog validation. The G-series proves the routing layer (progcont dispatch → function → dialog) is correct; the functional layer (what happens after the user interacts with the dialog) is the next frontier.
Trace infrastructure: Scripts (cv-trace-g1.au3, cv-trace-g2.au3, cv-trace-g3.au3) are deployed to C:\CV-Validation\scripts\ on each VM — they are not in the git sparse checkout. Deploy updates via SCP only. Bugs 82–85 fixed trace infrastructure issues (see tracker).
Week 10 (Mar 21–23): Panel Data Archaeology — COMPLETE¶
Objective: Fully decode the ConstructiVision compact panel Named Object Dictionary (NOD) structure, map all fields to the authoritative 152-entity list, and produce a JSON schema + decoder mapping for CV-Web import planning.
Result: Compact panel dictionary fully decoded. 14 sections extracted, 230 total data rows, 2930 flat properties per panel. Cross-VM consistency confirmed byte-identical across VM102 (XP/PB11), VM104 (XP/XP-TEST), and VM108 (Win10x32).
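The cross-VM consistency claim can be checked mechanically by hashing each VM's export and requiring a single distinct digest. This is a minimal sketch — the byte payloads and VM names below are stand-ins, not the real export files:

```python
# Hedged sketch: confirm byte-identical exports by comparing digests.
import hashlib

def digest(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# Stand-in payloads; in practice these are the per-VM JSON export files.
exports = {
    "vm102": b'{"panel": {"mp": 1}}',
    "vm104": b'{"panel": {"mp": 1}}',
    "vm108": b'{"panel": {"mp": 1}}',
}
hashes = {vm: digest(blob) for vm, blob in exports.items()}
consistent = len(set(hashes.values())) == 1  # True when byte-identical
```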
Work products (committed fc1d63d32):
| Artifact | Location | Description |
|---|---|---|
| | | Temporary extraction helper — dual export: |
| | | Raw text dump helper for section inspection |
| | | Combined flat+sections JSON schema (draft 2020-12) |
| | | Strict flat profile for validating |
| | | Tuple-index → field-name map for all 14 sections |
| | | 3-bucket comparison: 79/152 authoritative entities covered |
| | | 3 timestamped export runs with per-VM JSON + analysis |
Dictionary structure confirmed:
Key: `panel` in Named Object Dictionary (not `panel_list` — that key is absent in all test drawings)
Layout: DXF group code 1 = section tag, group code 2 = LISP-format payload string
14 sections: `bo, bp, ch, dl, dr, fs, lt, mp, pl, pp, rv, sd, so, wc`
Row counts (AAA001.dwg): mp=1, so=24, bo=30, dr=6, dl=6, pl=24, lt=12, ch=1, fs=48, sd=6, wc=48, pp=8, bp=6, rv=10 (total: 230)
Authoritative entity coverage (vs docs-developer/panel-entities.md):
| Bucket | Count | Notes |
|---|---|---|
| Exact runtime match | 28/152 | Tag names appear verbatim in section payloads |
| Schema-aligned match | 79/152 | Mapped via section prefix + tuple position |
| Unresolved | 73/152 | Missing from test drawing or require multi-panel corpus |
Runtime bugs resolved this campaign:
`VLA-GET-NAMEDIMAGEPREFS` — removed COM path entirely (not available pre-`vl-load-com`)
`fboundp` undefined on VM104 — removed; used plain `read` in `cvx-parse-payload`
`bad argument type: stringp nil` — nil-key guard added in JSON serializer
`rowCount = 0` — routing fixed to `cvx-extract-panels-from-compact` for the compact `panel` key
`rowCount = 1.00` float format — INT/REAL type-aware serializer: `itoa` for INT, `rtos 2 6` for REAL
Known architectural finding: The CSV.VLX was compiled from source code not in this repository. In VLX mode, progcont routing and the md_dlg dialog use a numeric-key DCL with powers-of-2 bitmasks. The recovered source uses a different string-key DCL. The panel compact dictionary layout was reverse-engineered entirely from runtime evidence — no VLX source was available.
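Since the VLX's numeric-key DCL uses powers-of-2 bitmasks, a progcont value can be decomposed into its set bits. The sketch below only illustrates the bit arithmetic — the meaning assigned to each bit position is not documented in the recovered source:

```python
# Hedged sketch: split a progcont value into its power-of-2 components.
def progcont_bits(value):
    return [1 << i for i in range(value.bit_length()) if value >> i & 1]

panel_bits = progcont_bits(262209)  # a G1 panel-menu value
slope_bits = progcont_bits(8193)    # Slope Calculator (per the G3 table)
```

Note that every panel-menu value in the G1 row shares the 262144 (2^18) component plus the low bit, consistent with a category-flag-plus-item encoding.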
Next: Extend archaeology to a site drawing — DONE. See Week 10.5 below.
Week 10.5 (Mar 24): Site Data Archaeology — COMPLETE¶
Objective: Decode the ConstructiVision compact site Named Object Dictionary (NOD) structure using the same methodology proven on the panel dictionary.
Result: Compact site dictionary fully decoded. 8 sections extracted, 537 total data rows. Validated on VM102 (AAAsite1.dwg).
Work products (committed 009f6f3e):
| Artifact | Location | Description |
|---|---|---|
| | | Site extraction helper — dual export: |
| | | Combined flat+sections JSON schema |
| | | Strict flat profile for validating |
| | | Tuple-index → field-name map for all 8 sections |
| | | Debug traces, raw dumps, and validated JSON exports |
| | | Per-VM site extraction outputs |
Dictionary structure confirmed:
Key: `site` in Named Object Dictionary (not `site_list` — `site_list` is absent in test drawings)
Layout: DXF group code 1 = section tag (string), group code 2 = LISP-format payload (string)
8 sections: `ms` (master site), `gx` (grid X), `gy` (grid Y), `wl` (wall lines), `sl` (slabs), `sc` (schedules), `tc` (tilt-up), `rc` (rebar/connections)
Row counts (AAAsite1.dwg): ms=1, gx=60, gy=60, wl=160, sl=64, sc=64, tc=64, rc=64 (total: 537)
Critical finding — DXF string encoding:
DXF codes 1–9 in AutoCAD XRECORDs are always stored as strings, not native LISP types. This means code-2 payloads like ((1 ("site1" 0 120.001 240.001 1.001 64))) are string representations, not actual lists. The cvxppnl.lsp panel script already handled this correctly via cvx-parse-payload which calls (read payload). The site script initially lacked this, causing 0-row extraction despite the raw data being present. Fix: added (if (= (type rows) 'STR) (setq rows (read rows))) to cvxs-parse-triplets.
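For the CV-Web import side, the same `read`-style parse can be mirrored outside AutoLISP. This is a minimal sketch of a LISP-payload reader handling only lists, strings, ints, and floats (the subset observed in CV payloads) — not the project's actual importer:

```python
# Hedged sketch: parse a code-2 LISP payload string into nested lists.
import re

_TOKEN = re.compile(r'"[^"]*"|[()]|[^\s()]+')

def lisp_read(text):
    tokens = _TOKEN.findall(text)
    pos = 0

    def parse():
        nonlocal pos
        tok = tokens[pos]
        pos += 1
        if tok == "(":
            items = []
            while tokens[pos] != ")":
                items.append(parse())
            pos += 1                   # consume ")"
            return items
        if tok.startswith('"'):
            return tok[1:-1]           # quoted string -> str
        for cast in (int, float):
            try:
                return cast(tok)
            except ValueError:
                pass
        return tok                     # bare symbol stays a string

    return parse()

row = lisp_read('((1 ("site1" 0 120.001 240.001 1.001 64)))')
```

Applied to the example payload above, the outer list holds one row whose second element is the `("site1" ...)` tuple.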
Deployment lesson learned: SCP to paths with spaces (C:/Program Files/ConstructiVision/) silently fails on Windows XP VMs. Deploy to C:/cvxpsite.lsp (root) or use quoted remote paths with single-quote wrapping: scp file 'user@host:"C:/Program Files/path"'. Build fingerprinting (cvxs-build-id) was essential to diagnose stale-file loading.
Next: Multi-panel/multi-site corpus export to extend entity coverage.
Weeks 13–15 (Apr 7–15): Headless DXF Parity Testing — FAIL (corrected)¶
Objective: Achieve full drawing-operation parity (G5) by running TB11 source-mode drawpan headlessly via accoreconsole.exe on AutoCAD 2027 x64, then comparing the DXF output entity-by-entity against a golden baseline exported from the original PB11 VLX-drawn panel.
Result: DXF PARITY FAIL — 97/350 entities (28%). Prior “447/447 PASS” was invalid.
Warning
Prior result invalidated (April 15, 2026). The earlier claim of “447/447 entity parity” was based on a mangled golden baseline. CSB001.dwg had been overwritten from 183KB to 88KB by repeated drawpan runs. The golden DXF was exported from this mangled file, so both golden and test had the same (wrong) entity set. The DXF comparison tool also only checked entity counts by TYPE|LAYER, not actual coordinate values.
Corrections applied:
CSB001.dwg restored to original (183,286 bytes, commit acc0079cd3, Feb 26 2026)
Golden DXF re-exported from restored original: 529,641 bytes, 350 entities (256 in modelspace)
`cv-dxf-compare.py` rewritten for full entity-by-entity comparison (every group-code value, not just counts)
`Run-ParityTest.ps1` created to copy the original before each test, preventing future mangling
Test infrastructure created:
| Artifact | Location | Purpose |
|---|---|---|
| | | Headless test harness — loads all 107 modules, runs panatt + drawpan, exports snap + DXF |
| | | AutoCAD script file to drive headless test via accoreconsole |
| | | Golden baseline DXF exporter — exports from original VLX-drawn drawing without re-running drawpan |
| | | Script to drive golden export via accoreconsole |
| | | Raw DXF parser and comparator — no ezdxf dependency; compares every entity group-code value for full identity (upgraded from count-only Apr 15) |
| | | Test runner: copies original drawing to work dir, runs drawpan on copy, compares against golden, verifies original unchanged |
| | | Golden baseline (529,641 bytes, 350 entities from restored original CSB001.dwg) |
| | | Test output for comparison |
Execution command:
```powershell
# Headless test (accoreconsole)
Copy-Item scripts\cv-auto-draw.lsp src\x64\TB11-01x64\cv-auto-draw.lsp -Force
& "C:\Program Files\Autodesk\AutoCAD 2027\accoreconsole.exe" `
  /i "src\Project Files\ConstructiVision Sample Building\CSB001.dwg" `
  /s "scripts\cv-auto-test.scr" 2>&1 | Tee-Object -FilePath reports\auto-test\runN.log

# DXF comparison
python scripts\cv-dxf-compare.py reports\golden\CSB001-golden.dxf reports\auto-test\CSB001-snap.dxf
```
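The entity-by-entity comparison that replaced the count-only check can be sketched as follows — DXF is an alternating group-code/value line format, and each entity starts at a group-code-0 tag. The tiny DXF fragments below are synthetic, not the real golden file:

```python
# Hedged sketch: split code/value line pairs into entities at each
# group code 0, then compare full (code, value) tuples, not just counts.
def parse_entities(dxf_lines):
    pairs = [(int(dxf_lines[i].strip()), dxf_lines[i + 1].strip())
             for i in range(0, len(dxf_lines) - 1, 2)]
    entities, current = [], None
    for code, value in pairs:
        if code == 0:                  # a new entity starts here
            current = [(code, value)]
            entities.append(current)
        elif current is not None:
            current.append((code, value))
    return entities

golden = ["0", "LINE", "8", "PANEL", "10", "0.0",
          "0", "TEXT", "8", "ANNO", "1", "label"]
test = ["0", "LINE", "8", "PANEL", "10", "0.0"]
missing = [e for e in parse_entities(golden)
           if e not in parse_entities(test)]
```

Comparing whole tuples is what exposed the invalid prior result: a count-by-TYPE|LAYER check cannot see missing coordinates or a golden file that shares the test's defects.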
Bugs fixed during headless parity campaign (Bugs 93–122+):
| Bug | Summary | Fix |
|---|---|---|
| Compact item toggle decoder | panatt.lsp only decoded master toggles, not individual item toggles | Added item-level decoder to panatt.lsp |
| panelvar save timing | panelvar not re-saved after item decoder ran | Re-save panelvar after compact decode |
| Layer off Y/N prompt | AutoCAD prompts “Do you really want to turn the current layer off?” | Added “Y” response in opening.lsp, chamfer.lsp, plt.lsp, nbblock.lsp |
| TRIM hang (AutoCAD 2027) | TRIM enters unbreakable crossing-window mode when pick misses | Skipped in headless mode with |
| Block redefine Y/N | | |
| INTERFERE native crash | INTERFERE crashes Core Console (C++ level, not LISP error) | Skipped in headless mode — SUBTRACT does the real boolean work |
| dlvar invalid window spec | dlvar drawing section threw invalid window specification | Error isolation with vl-catch-all-apply |
DXF comparison results (CSB001, corrected golden from restored original):
| Metric | Golden | Test | Match |
|---|---|---|---|
| Total DXF entities | 350 | 97 | ❌ 28% |
| Entity type categories | 59 | 2 match | ❌ 3% |
| DIMENSIONs | 20 | 0 | ❌ 0% |
| HATCHes | 16 | 0 | ❌ 0% |
| ATTRIBs | 73 | 0 | ❌ 0% |
| Annotation INSERTs | 29 | 0 | ❌ 0% |
| 3D geometry (solids) | 12 | 44 | ⚠️ Partial (geometry produced, counts differ) |
What TB11 produces: Main panel solid, feature extrusions (8), greenplate, J-bolt hardware (34 3DSOLID + 32 INSERT), perimeter outline (2 LWPOLYLINE), basic text labels (5 TEXT + 4 MTEXT).
What TB11 is missing: ALL dimension annotations (DIMENSION entities, leader LINEs, dimension TEXTs, dimension SOLIDs/arrows), ALL hatches (feature and hardware), ALL block INSERT annotations with ATTRIBs (connections, points, features, perimeter markers). The finpan/drawdim annotation pipeline is the primary gap.
Known headless limitations (secondary to the annotation gap):
TRIM skipped: TRIM in AutoCAD 2027 enters crossing-window mode when the pick point misses geometry. Six LISP-level approaches failed (TRIMEXTENDMODE=0, ESC cancel, split command, vl-cmdf, ssget pre-check, vl-catch-all-apply). Skipped with an `(if (not csv_headless) ...)` guard. TRIM is cosmetic — it trims panel edges at openings.
INTERFERE skipped: INTERFERE is a visual validation tool (highlights overlapping solids). It native-crashes `accoreconsole.exe` (generates a minidump). SUBTRACT (the real boolean operation) works correctly. Skipped with the same headless guard.
Assessment: G5 headless parity is NOT achieved. TB11 drawpan produces 3D geometry correctly but the annotation/dimensioning pipeline (finpan + drawdim + mkblk) is not generating annotations. The gap is 253 entities (350−97). Primary areas to fix: drawdim (dimensions), finpan annotation calls, hatch generation, and block-insert annotations with ATTRIBs. TRIM and INTERFERE headless limitations remain secondary issues.
P-Series: Sub-Dialog Visual Comparison (March 2026 — PLANNED)¶
The G-series confirmed all 17 menu routes dispatch correctly. The P-series goes one level deeper: capturing and comparing every sub-dialog’s visual layout against the golden PB11/VLX baseline to catch misaligned controls, wrong control types, missing elements, and layout drift that OCR text comparison misses entirely.
Warning
OCR is insufficient for dialog validation. OCR catches wrong text but cannot detect: control misalignment, wrong control types (edit_box vs popup_list), missing sliders, shifted radio buttons, wrong tab order, or different dialog dimensions. TB11 source-mode DCL dialogs are known to have layout issues vs PB11 golden. The P-series uses pixel-level visual comparison to catch these.
Methodology: Golden Baseline → Pixel Diff¶
Golden baseline: VM 102 (PB11 VLX) — the compiled VLX produces the definitive dialog layout. Existing golden screenshots in reports/ocr-output/vm102-final/ cover the 11 top-level dialogs. The P-series will extend this to all 21 panel sub-dialogs + 3 site sub-dialogs + utility dialogs.
Comparison tool: scripts/compare-screenshots.py (Pillow-only, no new dependencies)
```shell
# Compare golden (VM 102 PB11) against test (VM 108 TB11)
python scripts/compare-screenshots.py reports/p-series/golden/ reports/p-series/vm108-tb11/
# Output: diff images (red=mismatch), overlays, side-by-side, similarity report
```
Per-pair outputs:
`<name>-diff.png` — White pixels = match, red pixels = mismatch (intensity = magnitude)
`<name>-overlay.png` — 50% transparent golden overlaid on test (shows alignment drift)
`<name>-side.png` — Side-by-side with labels
`comparison-report.txt` — Similarity scores with PASS/FAIL per dialog
Pass criteria: ≥95% pixel match within tolerance (12/255 per channel, accounts for font rendering differences between XP and Win10). Auto-crops to dialog region to ignore the AutoCAD drawing background.
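The scoring rule can be sketched without Pillow by treating each image as a flat list of RGB tuples; this illustrates only the ≥95% / 12-per-channel criterion, not the cropping or report generation in `compare-screenshots.py`:

```python
# Hedged sketch of the pass rule: a pixel matches when every channel
# differs by <= tol (12/255); the dialog PASSes at >= 95% matching pixels.
def pixel_match_ratio(golden_px, test_px, tol=12):
    matches = sum(
        1 for g, t in zip(golden_px, test_px)
        if all(abs(gc - tc) <= tol for gc, tc in zip(g, t))
    )
    return matches / len(golden_px)

# Synthetic 100-pixel "images": 95 within tolerance, 5 far outside it.
golden = [(200, 200, 200)] * 95 + [(0, 0, 0)] * 5
test = [(205, 200, 190)] * 95 + [(255, 255, 255)] * 5
ratio = pixel_match_ratio(golden, test)  # lands exactly at the 95% threshold
```

The per-channel tolerance is what absorbs XP-vs-Win10 font antialiasing differences while still flagging a shifted or missing control as a block of hard mismatches.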
P-Series Capture Plan¶
Phase P1 — Panel Feature Sub-Dialogs (21 dialogs, ~2,237 controls):
Capture each sub-dialog by opening CSB001, entering mp_dlg (New Panel), and toggling each feature on to show its sub-dialog. Capture on both golden (VM 102 PB11) and test (VM 108 TB11).
| Dialog | DCL File | Controls | Slots | Capture Method |
|---|---|---|---|---|
| mp_dlg | mp_dlg.dcl | 49 | — | Open via md_dlg → New Panel |
| ro_dlg | ro_dlg.dcl | 61 | 4 openings | mp_dlg → Openings toggle |
| wd_dlg | wd_dlg.dcl | 207 | 12 × 2pg | mp_dlg → Windows toggle |
| dr_dlg | dr_dlg.dcl | 57 | 4 doors | mp_dlg → Doors toggle |
| dl_dlg | dl_dlg.dcl | 49 | 3 levelers | mp_dlg → Dock Leveler toggle |
| sb_dlg | sb_dlg.dcl | 103 | 6 blockouts | mp_dlg → Blockouts toggle |
| rb_dlg | rb_dlg.dcl | 55 | 6 round | mp_dlg → Round Blockout toggle |
| nb_dlg | nb_dlg.dcl | 157 | 6 non-rect | mp_dlg → Non-Rect toggle |
| fh_dlg | fh_dlg.dcl | 480 | 19 × 3pg | mp_dlg → Features toggle (horiz) |
| fv_dlg | fv_dlg.dcl | 366 | 19 × 3pg | mp_dlg → Features toggle (vert) |
| pl_dlg | pl_dlg.dcl | 34 | 3 pilasters | mp_dlg → Pilaster toggle |
| ll_dlg | ll_dlg.dcl | 40 | 3 lintels | mp_dlg → Lintel toggle |
| tp_dlg | tp_dlg.dcl | 13 | 1 plate | mp_dlg → Top Plate toggle |
| lb_dlg | lb_dlg.dcl | 85 | 4 bars | mp_dlg → Ledger toggle |
| ch_dlg | ch_dlg.dcl | 30 | 4 edges | mp_dlg → Chamfer toggle |
| ts_dlg | ts_dlg.dcl | 15 | 2 steps | mp_dlg → Top Steps toggle |
| fs_dlg | fs_dlg.dcl | 15 | 2 steps | mp_dlg → Footing Steps toggle |
| ss_dlg | ss_dlg.dcl | 19 | 2 seats | mp_dlg → Spandrel toggle |
| sd_dlg | sd_dlg.dcl | 28 | 4 dowels | mp_dlg → Slab Dowels toggle |
| pp_dlg | pp_dlg.dcl | 42 | 8 points | mp_dlg → Pick Points toggle |
| bp_dlg | bp_dlg.dcl | 34 | 4 braces | mp_dlg → Brace Points toggle |
| wc_dlg | wc_dlg.dcl | 615 | 15 × 5pg | mp_dlg → Weld Connections toggle |
Phase P2 — Site Sub-Dialogs (3 + supporting):
| Dialog | DCL File | Controls | Capture Method |
|---|---|---|---|
| grid_dlg | grid_dlg.dcl | 231 | site_dlg → Grid Lines |
| wall_dlg | wall_dlg.dcl | 711 | site_dlg → Wall Lines |
| slab_dlg | slab_dlg.dcl | 648 | site_dlg → Slab Lines |
Phase P3 — Multi-Page Dialogs:
Dialogs with multiple pages need each page captured separately:
wc_dlg: 5 pages (pg1–pg5) + wc_edit
fh_dlg / fv_dlg: 3 pages each + template
wall_dlg / slab_dlg: 4 pages each
wd_dlg: 2 pages
Phase P4 — Batch/Export/Utility:
| Dialog | DCL File | Controls | Capture Method |
|---|---|---|---|
| btch_dlg | btch_dlg.dcl | 22 | md_dlg → Batch Utilities |
| dreng_dlg | dreng_dlg.dcl | 25 | btch_dlg → Export → Dayton |
| mbeng_dlg | mbeng_dlg.dcl | 23 | btch_dlg → Export → Meadow-Burke |
| wsbeng_dlg | wsbeng_dlg.dcl | 25 | btch_dlg → Export → WSB |
| calc_dlg | calc_dlg.dcl | 27 | Menu → Slope Calculator |
Golden Capture Workflow¶
VM 102 (PB11 VLX): Open CSB001, navigate to each sub-dialog, screenshot each one. Store in `reports/p-series/golden/`
VM 108 (TB11 source): Same steps, same drawing. Store in `reports/p-series/vm108-tb11/`
Run comparison: `python scripts/compare-screenshots.py reports/p-series/golden/ reports/p-series/vm108-tb11/`
Review diff images: Red areas = layout differences. Triage as:
DCL fix needed — control misalignment fixable in `.dcl` source
LSP fix needed — wrong default values or missing `set_tile` calls
WAD — acceptable difference between VLX and source mode (document and accept)
Goal: Clone Golden Layout¶
The target is visual parity between TB11 source-mode dialogs and PB11 VLX-mode dialogs. Where the recovered .dcl source doesn’t match the VLX’s embedded DCL, the source DCL must be corrected to reproduce the golden layout. This is a progressive effort — start with the highest-priority dialogs (mp_dlg, wc_dlg, wd_dlg) and work down.
Full Test Plan Status & Completion Projections (March 5, 2026)¶
Completed Layers¶
| Layer | Description | Items | Done | Status |
|---|---|---|---|---|
| Layer 0 | OS compatibility baseline (5 platforms × 10 tests) | 45 | 40 | 89% — only Uninstaller row (5) remains |
| Layer 1.5 | VLX compilation verification | 5 | 5 | 100% ✅ |
| Layer 4 | UI reference documentation (doc 31) | 1 | 1 | 100% ✅ |
| Sprint Wk 1–6 | Checklist items | 34 | 32 | 94% — 2 deferred (v7.0 installer script, audit logging) |
Active Layer: P0 Incremental Baseline¶
| Step | Description | Configs | Done | Status |
|---|---|---|---|---|
| 0 | AutoCAD launch | 7 | 7/7 | ✅ |
| 1 | Window layout | 7 | 7/7 | ✅ |
| 2 | CV initialization | 7 | 7/7 | ✅ |
| 3 | Drawing open + CV behavior | 7 | 6/7 | ✅ (B–G PASS, A skipped) |
| 4–6 | Progcont decomposition | — | — | ✅ Covered by Step 7 (subsets of 11-item sweep) |
| 7 | Simple menu items | 7 | 6/7 | ✅ (B–G PASS, A not tested) |
| 8 | 3D Viewpoint submenu | 7 | 5/7 | ✅ (B, D–G PASS, A/C not tested) |
| 9 | View Layers submenu | 7 | 4/7 | 📌 PINNED — multi-story broken, needs test drawing |
| 10 | Shading submenu | 7 | 5/7 | ✅ (B, D–G PASS, A/C not tested) |
| 11 | Print submenu (panel) | 7 | 0/7 | ⬜ |
| 12 | Panel detection (Bug 63/87) | 7 | 0/7 | ⬜ — Bug 87 fixed; needs live re-verification |
| Grad | Full au3 | 7 | 0/7 | ⬜ |
| P0 Total | | | 40/56 | Steps 0–3, 4–6, 7–10 validated; 9 pinned; 11–12 next |
Next Phase: Sub-Dialog Feature Validation (March–April 2026)¶
The AutoIT automation covers the 6 top-level menu entries. The remaining 59 of the 65 workflows from doc 31 (Comprehensive Workflow & Human Factors Analysis, Section 6) require manual or extended AutoIT testing:
Priority 1 — Critical Workflows (must pass before App Store release):
P-02: Create simple panel with dimensions → verify DWG output
P-03: Panel with openings (windows subtracted from solid)
P-13: Weld connections (wc_dlg — 5 pages, 615 controls)
P-14/P-15: Pick points and brace points (block placement)
P-20: Panel with ALL features enabled simultaneously
P-21: Edit existing panel → modify → regenerate
B-01/B-02: Batch print all / batch redraw all
B-07: Print materials list → verify `matlist.txt` output
E-01/E-02: Dayton/Richmond + Meadow-Burke export → verify CSV/file output
S-01: Create site drawing → verify blank site
U-06: Registration manager (license activation flow)
Priority 2 — High Workflows:
P-04/P-05: Panels with doors / blockouts
P-10: Panel with chamfer (edge chamfers)
P-22: Copy panel as template
S-02: Grid lines (grid_dlg — 208 edit boxes)
S-05: Attach panels to site drawing
E-04/E-05: Import .pan / .pnl files
U-03: WC Edit (connection library update)
Priority 3 — NanoCAD + AutoCAD 2026 Compatibility:
Verify CV .lsp loading in NanoCAD 25 on VM 109
Test panel creation workflow in NanoCAD 25 (LSP-only, no COM)
Test CV source-mode loading in AutoCAD 2026 on VM 109
Document NanoCAD-vs-AutoCAD behavioral differences
Validate 150-file `nanocad25/` fork against 187-file `acad2000/` set
Priority 4 — VLX Compilation & Distribution:
Compile TB11 source → `csv.vlx` once sub-dialog validation stabilizes
Package as `.bundle` for the Autodesk App Store (see Release and Distribution Plan (App Store First))
Test `.bundle` installation on a clean VM
Create NanoCAD distribution package (LSP-only, no VLX)
Priority 5 — GitHub Native Fleet & Deployment Integration:
GitHub provides several native features that could replace or augment the current custom tooling. These are future enhancements, not blockers for the March 2026 release.
GitHub Environments — Define named environments (VM102-XP, VM104-XP, VM108-Win10, etc.) with deployment history. Shows which VM has which commit deployed. Native feature, no custom code needed beyond workflow configuration.
Deployment API — After `CV Update.bat` pulls, POST a deployment status to GitHub (commit hash, VM name, success/fail). Shows on the repo’s “Deployments” tab. Win10 VMs can use the `gh` CLI directly; XP VMs would need `curl` with a PAT token or a relay through the dev machine.
GitHub Check Runs — Attach validation test results (OCR summaries) to specific commits. Shows pass/fail inline on commit pages. Requires a GitHub App or a PAT-based API call after validation runs.
GitHub Projects (v2) — Kanban/table board for “Fleet Status” or “Validation Results” with custom fields and automation. Manual unless scripted.
Self-hosted Actions runners — Install runner agent on Win10 VMs (108, 109, 201, 202) to execute workflows on push and report online/offline status. Requires .NET Core — NOT viable for XP VMs (102, 103, 104).
Note
XP limitation: GitHub CLI (gh), self-hosted runners, and modern HTTPS certificate chains all require runtimes unavailable on Windows XP. XP VMs (102–104) will always need relay through the dev machine or a proxy VM for API-based reporting. Win10 VMs (108, 109, 201, 202) can use these features directly.
Deliverables¶
Completed (Sprint Jan 21 – Feb 28):
✅ `reports/ocr-output/vm{102,103,104}-final/` — AutoIT validation screenshots + logs
✅ `reports/ocr-output/*/ocr-output/summary.txt` — OCR text extraction
✅ `scripts/cv-menu-validation.au3` — Automated UI validation script (572 lines)
✅ `scripts/ocr-screenshots.py` — BMP→PNG + Tesseract OCR pipeline
✅ 31 — Comprehensive Workflow & Human Factors Analysis — 45-row cross-reference + 65-workflow test matrix + 44-dialog inventory
✅ Bug Tracker — Validation Campaign — live bug inventory with DFMEA cross-references
✅ Software Development Lifecycle — SDLC process (1,007 lines)
Remaining:
⏳ Sub-dialog feature validation report (59 workflows from doc 31 Section 6)
✅ Site drawing data archaeology (`site`/`site_list` NOD key decode) — COMPLETE (Mar 24)
❌ Headless DXF parity (G5 drawpan CSB001) — FAIL (Apr 15; 97/350 entities, 28%; prior 447/447 claim invalid)
⏳ Fix annotation/dimensioning pipeline (finpan, drawdim, hatches, block-inserts)
⏳ Multi-panel corpus export (extend 79/152 entity coverage to full 152)
⏳ NanoCAD 25 compatibility report
⏳ AutoCAD 2026 compatibility report
⏳ VLX compilation report (deferred until source-mode validation complete)
⏳ `.bundle` packaging for Autodesk App Store