The Core Console would be a great help to speed up certain tasks, but I need to test commands that include selection by pick or by window.
Here is the solution I came up with. So far I have only two small tests, and each test doesn't do much, but it works and it's promising. Any comment is welcome: "it's great", "it sucks" or "you are reinventing the wheel" (I really hope to get the third one).
The test suite uses an iterator to go through all the subfolders of the Tests folder. Each subfolder represents one test, consisting of an AutoCAD session that executes a script. So if I want to add a test, I just create a new subfolder; there is no need to add code to the test suite.
Here are two test folders, JointLines and JointLines2:

Each folder contains the following:
- One or more start dwg files, usually one
- One or more reference dwg files, usually many
- The file Script.scr, containing the steps that the test must execute
- The file TestConfiguration.json, containing some info used by the test suite. So far there is only one parameter, "KillAfterSeconds", which tells the test suite to kill AutoCAD if it doesn't complete the test quickly enough
- The file RunTest.cmd. This is not required, but it's handy to start AutoCAD with the script exactly as the test suite does
- The file Log.txt. This is deleted when the test suite starts and created when the test runs
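For reference, with only the "KillAfterSeconds" parameter mentioned above, TestConfiguration.json is as small as this (60 is just an example value):

```json
{
    "KillAfterSeconds": 60
}
```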
The file Script.scr opens a start drawing, executes some commands on it, then executes the "CompareWithReferenceDrawing <reference dwg path>" command. It then repeats, either with the same start drawing and different commands and reference drawings, or with different start drawings. The "CompareWithReferenceDrawing" command compares all the entities in the current drawing (that is, the result of the commands the script executed on the start file) with the entities in the reference drawing, and appends the result of the comparison to Log.txt. The test suite looks at Log.txt to report any test failure.
The tests are executed by the following test suite:
using System.Collections;
using System.Collections.Generic;
using System.Diagnostics;
using System.IO;
using System.Text.RegularExpressions;
using System.Threading;
using NUnit.Framework;
using Newtonsoft.Json;
namespace Tests;
[TestFixture]
public class CommandsFromFoldersTests
{
    private const string CoreConsolePath = "C:/Program Files/Autodesk/AutoCAD 2023/accoreconsole.exe";
    private const string AutoCadPath = "C:/Program Files/Autodesk/AutoCAD 2023/acad.exe";
    private const string DirectoryWithTestsDirectories = "C:/workspace/AutoCAD/IntelliClad/Tests/Tests";

    // one test case per subfolder of the Tests folder
    private static IEnumerable TestCases
    {
        get
        {
            foreach (var directory in Directory.GetDirectories(DirectoryWithTestsDirectories))
                yield return new TestCaseData(directory);
        }
    }

    [Test, TestCaseSource(nameof(TestCases))]
    public void CommandFromFolderTest(string folderPath)
    {
        var configurationFile = $"{folderPath}/TestConfiguration.json";
        var scriptFile = $"{folderPath}/Script.scr";
        var logFile = $"{folderPath}/Log.txt";
        var jsonString = File.ReadAllText(configurationFile);
        var testConfiguration = JsonConvert.DeserializeObject<Dictionary<string, object>>(jsonString);
        // Json.NET deserializes JSON integers as long, hence the double cast
        var killAfterSeconds = (int)(long)testConfiguration["KillAfterSeconds"];

        // delete the log file left over from the previous run
        if (File.Exists(logFile)) File.Delete(logFile);

        // prepare arguments for starting AutoCAD and running the script
        var startInfo = new ProcessStartInfo
        {
            FileName = AutoCadPath,
            Arguments = $"/b \"{scriptFile}\"",
            UseShellExecute = false,
            RedirectStandardOutput = true,
            RedirectStandardError = true
        };

        // start AutoCAD
        using (var process = new Process())
        {
            process.StartInfo = startInfo;
            process.EnableRaisingEvents = true;
            Assert.IsTrue(process.Start(), $"Failed to start {AutoCadPath}");
            // drain the redirected streams so a full pipe cannot block AutoCAD
            process.BeginOutputReadLine();
            process.BeginErrorReadLine();

            // wait for the process to complete, killing it if it takes too long
            // (Assert.Fail must run on the test thread, so a timeout on
            // WaitForExit is used rather than a Timer callback)
            if (!process.WaitForExit(killAfterSeconds * 1000))
            {
                process.Kill();
                Assert.Fail($"AutoCAD was killed because the test was not completed within {killAfterSeconds} seconds.");
            }
        }

        // fail the test if the log file contains a line with "=== ERROR ==="
        // (created by CompareWithReferenceDrawing); \r?\n also matches Windows line endings
        Assert.IsTrue(File.Exists(logFile), "Missing log file.");
        var testResult = File.ReadAllText(logFile);
        var match = Regex.Match(testResult, @"=== ERROR ===\r?\n(.*)");
        if (match.Success)
            Assert.Fail(match.Groups[1].Value);
    }
}
Below is an example of a Script.scr file.
This script opens Start.dwg, starts the SetJointLine command, picks two points (that is, selects two entities), then sets the joint size to 1. It repeats with two different points and a different joint size, then starts the CompareWithReferenceDrawing command, which compares the current drawing (as is, after running SetJointLine twice) with the reference drawing SetJointLine2NewValues.dwg.
The script then closes the file, reopens it, executes the same commands with different joint sizes, and compares the result with SetJointLine1New1ExistingValue.dwg.
Finally, the script closes AutoCAD.
;============================================
;test SetJointLine with two new joint size values
;============================================
Open
"C:/workspace/AutoCAD/IntelliClad/Tests/Tests/JointLines/Start.dwg"
Zoom
E
SetJointLine
244,-110
244,-85
1
SetJointLine
244,-60
244,-40
2
CompareWithReferenceDrawing
"C:/workspace/AutoCAD/IntelliClad/Tests/Tests/JointLines/SetJointLine2NewValues.dwg"
Close
Y
;============================================
;test SetJointLine with one new and one existing joint size values
;============================================
Open
"C:/workspace/AutoCAD/IntelliClad/Tests/Tests/JointLines/Start.dwg"
Zoom
E
SetJointLine
244,-110
244,-85
0.375
SetJointLine
244,-60
244,-40
0
CompareWithReferenceDrawing
"C:/workspace/AutoCAD/IntelliClad/Tests/Tests/JointLines/SetJointLine1New1ExistingValue.dwg"
Quit
Y
The last two pieces of the puzzle are the commands CompareWithReferenceDrawing and SelectEntityByDrawingNameAndHandle. I have two implementations that are good enough for a proof of concept (I tested the test suite and I get success or failure as expected), but nothing I can show yet.
CompareWithReferenceDrawing creates a list of entities from each of the drawings and compares them. The comparison treats points as identical when they are within a small tolerance, considers some entity properties like color or layer, and ignores others. The entity lists from the two drawings are sorted so that two drawings that "look" the same are considered the same, even if their entities appear in a different order: a change in an algorithm could change the order in which entities are generated, and that shouldn't cause the test to fail.
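To give an idea, the fuzzy point comparison and the order-independent sort could be sketched like this (a simplified sketch, not my actual implementation; EntityData, the tolerance value and the sort key are all made up for the example):

using System;
using System.Collections.Generic;
using System.Linq;

// Simplified stand-in for the data extracted from each drawing entity
public record EntityData(string TypeName, string Layer, double X, double Y, double Z);

public static class DrawingComparer
{
    private const double Tolerance = 1e-6;

    // Two coordinates are "the same" when they differ by less than the tolerance
    public static bool AlmostEqual(double a, double b) => Math.Abs(a - b) < Tolerance;

    // Sort both lists by type and rounded position so that two drawings that
    // "look" the same compare equal even if their entities were created in a
    // different order. (Rounding can still split two nearly-equal points into
    // different buckets, but it is good enough for a proof of concept.)
    public static List<EntityData> SortForComparison(IEnumerable<EntityData> entities) =>
        entities.OrderBy(e => e.TypeName)
                .ThenBy(e => Math.Round(e.X, 6))
                .ThenBy(e => Math.Round(e.Y, 6))
                .ThenBy(e => Math.Round(e.Z, 6))
                .ToList();
}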
When two entities are different, three lines are added to the log file: one containing "=== ERROR ===", one with a description of the difference (something like "Line has wrong layer"), and one containing "SelectEntityByDrawingNameAndHandle <dwg name> <entity handle>".
SelectEntityByDrawingNameAndHandle opens a file and zooms to the specified entity. I can copy a line from the log file, paste it into the AutoCAD command line, and the drawing opens with the entity selected and zoomed to.
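For illustration, here is a rough sketch of how such a command could look with the AutoCAD .NET API (not my actual implementation; the argument parsing and error handling are simplified, and Editor.Command and Database.TryGetObjectId require AutoCAD 2015 or later):

using System;
using Autodesk.AutoCAD.ApplicationServices;
using Autodesk.AutoCAD.DatabaseServices;
using Autodesk.AutoCAD.EditorInput;
using Autodesk.AutoCAD.Runtime;

public class SelectionCommands
{
    // CommandFlags.Session because the command opens another document
    [CommandMethod("SelectEntityByDrawingNameAndHandle", CommandFlags.Session)]
    public void SelectEntityByDrawingNameAndHandle()
    {
        var doc = Application.DocumentManager.MdiActiveDocument;
        // read the drawing path and the entity handle from the command line
        var path = doc.Editor.GetString("\nDrawing path: ").StringResult;
        var handleText = doc.Editor.GetString("\nEntity handle: ").StringResult;

        var target = Application.DocumentManager.Open(path, false);
        var ed = target.Editor;
        // handles are hexadecimal strings
        var handle = new Handle(Convert.ToInt64(handleText, 16));
        if (!target.Database.TryGetObjectId(handle, out var id))
        {
            ed.WriteMessage($"\nHandle {handleText} not found.");
            return;
        }
        using (target.LockDocument())
        using (var tr = target.Database.TransactionManager.StartTransaction())
        {
            var entity = (Entity)tr.GetObject(id, OpenMode.ForRead);
            var extents = entity.GeometricExtents;
            // zoom to a window around the entity, then select it
            ed.Command("_.ZOOM", "_Window", extents.MinPoint, extents.MaxPoint);
            ed.SetImpliedSelection(new[] { id });
            tr.Commit();
        }
    }
}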
Here is an example of a log file with two failures (the test suite only reports the first one):
=== ERROR ===
Line with different start point: (315.xxx-xxxxxxxx,-99.722123995678,0) - (223.833839706771,-27.722123995678,0)
SelectEntityByDrawingNameAndHandle C:\workspace\AutoCAD\IntelliClad\Tests\Tests\JointLines\Start.dwg: 2928
SelectEntityByDrawingNameAndHandle C:\workspace\AutoCAD\IntelliClad\Tests\Tests\JointLines\SetJointLine2NewValues.dwg: 2954
=== ERROR ===
Different color: 255,0,0 - BYLAYER
SelectEntityByDrawingNameAndHandle C:\workspace\AutoCAD\IntelliClad\Tests\Tests\JointLines\Start.dwg: 2908
SelectEntityByDrawingNameAndHandle C:\workspace\AutoCAD\IntelliClad\Tests\Tests\JointLines\SetJointLine1New1ExistingValue.dwg: 2908
Here is the test result of the test suite with two folders:
