
drivers: stepper: tests: Test Generalization #85658


Open
wants to merge 3 commits into main

Conversation

jbehrensnx
Copy link
Contributor

This PR includes the test generalization discussed in PR #82843.

This PR has three objectives: first, to offer a set of test suites for stepper drivers that can be used to help develop new drivers; second, to provide a specification of the expected API behavior, something that is currently missing; and third, to offer a tool for evaluating new drivers before approval.

While several stepper drivers have been developed, testing was often done only haphazardly, primarily via the stepper shell, as in PR #79120. Testing via the shell is not only inconvenient and time consuming, it is also unreliable and non-repeatable, allowing many bugs and other issues to slip through. This set of test suites counteracts that by offering repeatable tests that cover a wide range of expected behavior, making development easier and improving driver quality.

The stepper API currently lacks a specification for expected behavior. This not only makes driver development more difficult, as it is unclear how a new driver should behave in certain situations, it also leads to differences in behavior between drivers. This is in stark contrast to the goals of an API, where the user shouldn't care about the specific implementation and each API call should instead lead to the same result. These tests can alleviate this by defining the expected behavior: drivers that pass them behave the same in the covered areas, improving their interchangeability from the user's perspective.

Third, these test suites offer a way to evaluate new drivers before approval and thus ensure a certain level of quality. Ideally, a new stepper driver should pass all applicable tests before it is approved.

Test Suites

stepper_api

This is a complete replacement of the existing stepper_api test suite. It checks the general behavior of the driver and its handling of incorrect input.
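
To illustrate the style of these checks, here is a minimal sketch of what such a test case could look like. The fixture, the stepper devicetree alias, and the concrete invalid value are assumptions for illustration only; they are not the exact code in this PR.

#include <zephyr/device.h>
#include <zephyr/drivers/stepper.h>
#include <zephyr/ztest.h>

/* Hypothetical fixture; the real suite resolves the driver under test via a devicetree alias. */
struct stepper_fixture {
	const struct device *dev;
};

static struct stepper_fixture test_fixture = {
	.dev = DEVICE_DT_GET(DT_ALIAS(stepper)),
};

static void *stepper_setup(void)
{
	zassert_true(device_is_ready(test_fixture.dev), "stepper device is not ready");
	return &test_fixture;
}

ZTEST_SUITE(stepper, NULL, stepper_setup, NULL, NULL, NULL);

/* Incorrect input: an unsupported microstep resolution must be rejected with a
 * negative errno instead of being silently accepted (or crashing the driver).
 */
ZTEST_F(stepper, test_micro_step_res_set_incorrect_value_detection)
{
	int ret = stepper_set_micro_step_res(fixture->dev,
					     (enum stepper_micro_step_resolution)127);

	zassert_true(ret < 0, "invalid microstep resolution should be rejected");
}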

stepper_const_velocity

This test suite checks that drivers that don't use acceleration, and thus run at a constant velocity, move as expected. This includes that move_to and move_by finish in the expected time, and that all three movement commands result in the expected position after a given time.
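
As a sketch of the timing idea, reusing the hypothetical fixture from the sketch above; the interval, step count and tolerance are illustrative values, not the PR's actual parameters:

/* With a fixed microstep interval, move_by(N) should finish in roughly N * interval,
 * after which the position has advanced by N and the driver reports it is no longer moving.
 */
ZTEST_F(stepper, test_move_by_expected_time_and_position)
{
	const uint32_t interval_ms = 20;
	const uint64_t interval_ns = (uint64_t)interval_ms * USEC_PER_MSEC * NSEC_PER_USEC;
	const int32_t steps = 50;
	int32_t start_pos, end_pos;
	bool moving = true;

	zassert_ok(stepper_get_actual_position(fixture->dev, &start_pos));
	zassert_ok(stepper_set_microstep_interval(fixture->dev, interval_ns));
	zassert_ok(stepper_move_by(fixture->dev, steps));

	/* Wait for the expected movement time plus a small tolerance. */
	k_msleep(steps * interval_ms + 100);

	zassert_ok(stepper_is_moving(fixture->dev, &moving));
	zassert_false(moving, "movement should have finished in the expected time");

	zassert_ok(stepper_get_actual_position(fixture->dev, &end_pos));
	zassert_equal(end_pos - start_pos, steps,
		      "expected %d steps after the expected time", steps);
}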

Current Test Results

All tests were executed on a Nucleo F767ZI board with the CONFIG_ZTEST_SHUFFLE flag set, ensuring both random order and repetition. For the GPIO Stepper Controller, incorrect spinlock usage causes test_micro_step_res_set_incorrect_value_detection to crash the system; for this driver the test is skipped and counted as failed. The following table gives the test pass rate for both test suites and all tested drivers.

Driver                     stepper_api   stepper_const_velocity
GPIO Stepper Controller    83.33%        100%
DRV8424 Work Queue         88.89%        60%
DRV8424 Counter            100%          100%
TMC2209 Work Queue         50%           60%
TMC2209 Counter            55.56%        100%

Limitations

Like all tests, this set of test suites has its limitations. While the new stepper_api test suite should be applicable to all stepper drivers, stepper_const_velocity, as the name suggests, is only meant for drivers that move at a constant velocity. Drivers that use acceleration, such as tmc50xx or the acceleration step-dir implementation currently under development, require their own test suite to evaluate movement.

Another limitation of this set of test suites is that it is limited to evaluating API return values; it cannot check physical behavior. In the case of a step-dir driver, for example, it cannot check that an order for 100 steps actually produces 100 pulses on the step signal, or that they occur at the right frequency. Twister's support for these kinds of tests is currently limited. In addition, such tests require a far greater degree of customization for the target driver, going beyond the scope of this PR.

Possible Future Work

  • Fixing existing stepper drivers so that they pass all tests.
  • Development of tests that evaluate actual outputs. Likely to be fairly board/driver specific, but some generalization for step-dir might be possible.
  • Development of tests for drivers that use acceleration. Very high likelihood of being driver specific.
  • Discussion (possibly in this PR) about acceptance criteria for new stepper drivers.

Copy link
Contributor

@faxe1008 faxe1008 left a comment


First rough pass, thanks @jbehrensnx amazing effort!

@@ -6,4 +6,7 @@ config ENTROPY_GENERATOR
bool "Shuffle the order of tests and suites"
default y

config SPEED_STEPS
int "Test speed in microsteps/s, also used as number of steps to make for certain tests"
Copy link
Contributor


A bit long and there is a grammar error; maybe move this to a help text?

config ENTROPY_GENERATOR
depends on ZTEST_SHUFFLE
bool "Shuffle the order of tests and suites"
default y
Copy link
Contributor


This looks like it's reversed - shuffling would depend on the entropy generator, no?

Copy link
Contributor Author


Technically yes, but it is in that order to ensure that ENTROPY_GENERATOR is automatically enabled when shuffling. Having ZTEST_SHUFFLE select ENTROPY_GENERATOR is probably the better option.

@jbehrensnx
Copy link
Contributor Author

@jilaypandya For some reason you were not included automatically

@jilaypandya
Copy link
Member

Thanks @jbehrensnx great work in unifying the tests :)

The stepper API currently lacks a specification for expected behavior.

Should we not fix the stepper api specification before writing tests, i mean it's probably a change to the API documentation, or?

@jbehrensnx
Copy link
Contributor Author

Thanks @jbehrensnx great work in unifying the tests :)

The stepper API currently lacks a specification for expected behavior.

Should we not fix the stepper api specification before writing tests, i mean it's probably a change to the API documentation, or?

On some level you could say yes. The tests can however help us find cases where the current api specification is insufficient. That said, I would agree that we should complete the specification discussion before merging/approving this PR.

The other question is: should we discuss the API specification in this PR or in an RFC?

@jbehrensnx
Copy link
Contributor Author

  • Fixed and added copyright information
  • Temporarily removed qemu_x86_64 from the test cases (overlays remain), as the stepper_api tests currently fail for this platform, thus failing some GitHub checks
  • Fixed a wrong expected return value in one test (setting an incorrect microstep resolution). This causes one previously passing test for tmc2209 and drv8424 to fail, and the inverse to happen for the fixed gpio driver

@jilaypandya
Copy link
Member

Thanks @jbehrensnx great work in unifying the tests :)

The stepper API currently lacks a specification for expected behavior.

Should we not fix the stepper api specification before writing tests, i mean it's probably a change to the API documentation, or?

On some level you could say yes. The tests can however help us find cases where the current api specification is insufficient. That said, I would agree that we should complete the specification discussion before merging/approving this PR.

The other question is: should we discuss the API specification in this PR or in an RFC?

Yes, an RFC sounds good 👍

@jilaypandya
Copy link
Member

Please rebase :)

@jbehrensnx
Copy link
Contributor Author

Significant rework, incorporating the changes that have occurred since. Replaced test_config.yaml with tags. Now using an alias instead of chosen, as that is what the stepper_api test suite was using.

@zephyrbot zephyrbot requested a review from jilaypandya March 19, 2025 15:00
Mimxrt1060-EVKB
---------------

NOTE: For this test no real a4979 was used and thus the following pin configuration is
Copy link
Member


We can and do mention the pinout in comments in the overlays. Please drop this and the other tables.

Suggested change
NOTE: For this test no real a4979 was used and thus the following pin configuration is

Copy link
Member

@jilaypandya jilaypandya left a comment


While I do appreciate this extensive testing, for an API that is still experimental this would mean that each time a change is introduced, quite a bit of refactoring will have to be undertaken.

int "Test speed in microsteps/s"
default 50
help
Besides being the test speed, this value is also used in several tests as the number of
Copy link
Member


Description says otherwise

Copy link
Member


Can we have the moving of files in a separate commit?

Copy link
Contributor Author


I'll see what I can do, but as squashing everything into one commit was not the last thing I did, it might be difficult.

Copy link
Member


Right now a lot of different things are squashed into one commit. Let's bring the changes in step by step.
In the first step, probably just transfer the existing drv8424_api tests to stepper_api. That would also make reviewing easier and give more context about what is happening.

Copy link
Member

@jilaypandya jilaypandya left a comment


I think there are some overlays which you have not actually tested on the hardware; can we please not include them? I stick to the example of nucleo_g071rb, since I test a few drivers that I have with that board.

Also this leaves room for further contributors who want to get their first PR badge :-)

Nucleo F767ZI
-------------

NOTE: For this test no real tmc2209 was used and thus the following pin configuration is
theoretical.

Copy link
Member


Is this actually tested on the hardware or is it just suggestive?

Copy link
Contributor Author


The drv8424, tmc2209 and a4979 drivers don't actually need hardware at this point. One can simply define and configure a node for them and then run the tests successfully, which is what I did for the tmc and allegro.
I defined devicetree nodes for the drivers and tested them on the two boards. The wording is admittedly a bit misleading: I tested them on actual boards, just without an actual stepper controller.

Copy link
Contributor Author


It is basically the same as the native_sim tests, just on a real board and not a simulated one.

Copy link
Contributor Author


Updated the readme so that this is hopefully clearer.

Copy link
Member


Is this some pre-work for the upcoming ramp drivers?

Copy link
Contributor Author


More or less, plus the tmc50xx driver already exists. As we have drivers with differing movement behavior, separating these tests seemed like a good idea.

Copy link
Member


So set_micro_step_interval is the only way of setting speed in the stepper API. tmc50xx in its current form does not support this; the timing of the steps is decided internally by the IC. One can set the ramp parameters and the rest of the work is taken over by the IC.

So the stepper API inherently does not have any functions which would suggest anything other than fixed step-interval ticking, i.e. set_micro_step_interval. Hence there is no need to diverge into two different test suites as of now.

Copy link
Contributor Author


Actually, there is. stepper_api is intended to include tests that all stepper drivers need to be able to pass. This includes tmc50xx. The remaining functionality needs to be tested in other test suites. This is where stepper_const_velocity comes into play. Another test suite concerning the specific behavior of tmc50xx should also be developed at some point, but that is beyond the scope of this PR.

extra_args:
- platform:native_sim/native/64:DTC_OVERLAY_FILE="boards/native_sim/allegro_a4979.overlay"
- platform:nucleo_f767zi/stm32f767xx:DTC_OVERLAY_FILE="boards/nucleo_f767zi/allegro_a4979.overlay"
- platform:mimxrt1060_evk@B/mimxrt1062/qspi:DTC_OVERLAY_FILE="boards/mimxrt1060_evk_mimxrt1062_qspi_B/allegro_a4979.overlay"
Copy link
Member


please only add the overlays if they are actually tested on the hardware

NOTE: For this test no real a4979 was used and thus the following pin configuration is
theoretical.

.callback = stepper_print_event_callback,
.callback = stepper_api_print_event_callback,
.base_ms_resolution = CONFIG_BASE_MS_RESOLUTION,
.test_ms_resolution = CONFIG_TEST_MS_RESOLUTION,
Copy link
Member


where exactly is this getting used?

Copy link
Contributor Author

@jbehrensnx jbehrensnx Mar 20, 2025


During the rework I forgot to add the corresponding test.
Edit: Added missing test.

int32_t steps = 100;
bool moving = true;

(void)stepper_set_microstep_interval(fixture->dev, fixture->test_interval);
Copy link
Member

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

Suggested change
(void)stepper_set_microstep_interval(fixture->dev, fixture->test_interval);
(void)stepper_set_microstep_interval(fixture->dev, CONFIG_TEST_MS_RESOLUTION);

Copy link
Contributor Author


That constant is intended to be used to test that a new microstep resolution can be successfully set. I forgot to add the test during the rework, which will be fixed. Managing the interval via a constant is however something I will do.

(void)stepper_move_by(fixture->dev, steps);
(void)stepper_enable(fixture->dev, false);
(void)stepper_is_moving(fixture->dev, &moving);
zassert_false(moving, "Driver should not be in state is_moving after being disabled");
Copy link
Member


While this statement is indeed correct, it's just a bit of a handful to read :)

Suggested change
zassert_false(moving, "Driver should not be in state is_moving after being disabled");
zassert_false(moving, "stepper should not be moving once disabled");

Copy link
Contributor Author


Your recommendation is somewhat imprecise. There is a difference between the stepper being in the is_moving state and actually moving (i.e. the position counter changing). The new error message could indicate the latter, which is however tested in another test.
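
For illustration, a hypothetical companion test for that distinction, written in the same style as the snippet above (it reuses the fixture fields and the two-argument stepper_enable() form shown there; later revisions of this PR changed that signature):

ZTEST_F(stepper, test_position_constant_while_disabled)
{
	int32_t pos_at_disable, pos_later;

	(void)stepper_set_microstep_interval(fixture->dev, fixture->test_interval);
	(void)stepper_move_by(fixture->dev, 100);
	(void)stepper_enable(fixture->dev, false);

	/* "Actually moving" means the position counter keeps changing, which is
	 * checked separately from the is_moving flag.
	 */
	(void)stepper_get_actual_position(fixture->dev, &pos_at_disable);
	k_msleep(100);
	(void)stepper_get_actual_position(fixture->dev, &pos_later);

	zassert_equal(pos_later, pos_at_disable,
		      "position counter must not change while the driver is disabled");
}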

(void)stepper_enable(fixture->dev, true);
(void)k_msleep(100);
(void)stepper_is_moving(fixture->dev, &moving);
zassert_false(moving, "Driver should not be in state is_moving after being reenabled");
Copy link
Member


Suggested change
zassert_false(moving, "Driver should not be in state is_moving after being reenabled");
zassert_false(moving, "Stepper should not start moving upon being just enabled");

Copy link
Contributor Author


Being reenabled is actually an important part of the test and thus should occur in the error message.

@jbehrensnx
Copy link
Contributor Author

I will drop the qemu and native_sim platforms from the const_velocity test suite, as they will always fail, and this causes issues with the automated tests.
For stepper_api I will remove the native_sim and qemu platforms from some tests for the same reason, but as in this case the failures are a result of the drivers themselves and not of timer issues in the simulation, the overlays and everything else will remain, so that they can be re-enabled once the issues in the drivers are fixed.

@jbehrensnx
Copy link
Contributor Author

While I do appreciate this extensive testing, for an API that is still experimental this would mean that each time a change is introduced, quite a bit of refactoring will have to be undertaken.

That is actually a very good reason to write all these tests, as they will help to ensure that the API still behaves as expected after these changes. It is not quite Test Driven Development, but it borrows at least a few ideas from it.

{
int32_t steps = 1000;
int32_t position_1;
int32_t position_2;
Copy link
Member


int32_t position_before_reenabled;
int32_t position_after_reenabled;

#################

Tests the general functionality and behaviour of stepper drivers to ensure conformity to the stepper
api specification. All new stepper drivers should pass this test suite. It should be noted, that this
Copy link
Member


All new stepper drivers should pass this test suite.

This is actually a process decision; tbh I have not seen this in any other driver subsystem till now.

Copy link
Contributor Author


I am aware of this. That said, having a test suite that every new driver should pass (excluding exceptional circumstances with a good reason, I am not ruling these out) would improve the quality and consistency of new stepper drivers.

There are multiple possible reasons why other driver subsystems have not done this. One of them is finding a common core of functionality to set as the criterion (which can be seen even with stepper drivers, considering I split it into two test suites). The counter drivers, for example, have an extensive test suite, but it is missing tests for some functions while having tests not every driver can pass.
Another reason is that it might be very difficult to test these drivers in a meaningful way.
Then there is the large number of existing drivers that might fail, which would result in many changes to these. In that regard, the low number of stepper drivers is a boon, as it makes this quite manageable.
And finally, people might not have wanted/been able to spend the required resources on this.

Another thing to consider is that with the growing commercial importance of Zephyr, testing will become more important, and the stepper API could get a head start on this.

Also, the goal of these tests is to test that a driver fulfills the api specification. If a new driver fails these tests, it does not fulfill the specification, so why would we want to accept it?

If we do it this way, there will be followup discussions about issues we find and about clarifications, but I think it is an important first step.

Copy link
Member


And finally, people might not have wanted/been able to spend the required resources on this.

This is a very good reason to not increase the entry bar for new drivers.

Copy link
Contributor Author


And finally, people might not have wanted/been able to spend the required resources on this.

This is a very good reason to not increase the entry bar for new drivers.

This was in reference to spending the resources creating such a test suite.
Making the test suite mandatory doesn't really increase the entry bar for new drivers noticeably. In fact, one could argue that it lowers it. When developing a new driver you want to use the test suite during development anyway, as it is an easy way to check that your driver is doing the right things. One will still need to use the shell on occasion, but the tests make the development process much easier. With this background, passing the test suite becomes a normal part of developing a driver, and thus does not raise the entry bar.
In the end, we have something akin to Test Driven Development, but with the tests already existing.

Copy link
Member


I am aware of this. That said, having a test suite that every new driver should pass (excluding exceptional circumstances with a good reason, I am not ruling these out)

What would be these exceptional circumstances?

I have mentioned this in one of the earlier comments as well: testing the purely GPIO-based drivers is easy and, as far as I can see, can be very well covered by such test suites. But drivers like tmc5xxx, or for that matter even tmc2209 (only UART based), would need some sort of emulator. So what would we do in this case?

Copy link
Contributor Author


What would be these exceptional circumstances?

I'm not sure until it happens. As these tests are not perfect, there is always the possibility that there is some edge case or another reason that I did not consider. In that case, these will need to be discussed. That is what I meant to say with this.

I have mentioned this in one of the earlier comments as well: testing the purely GPIO-based drivers is easy and, as far as I can see, can be very well covered by such test suites. But drivers like tmc5xxx, or for that matter even tmc2209 (only UART based), would need some sort of emulator. So what would we do in this case?

Those that can be tested via native_sim/qemu are; the rest can't be tested this way. That is a limitation of Zephyr: some things can only be tested on real hardware. If, however, someone develops a new driver, they have the means to test it. In that case we either have to trust their word or ask for the test log. It's one of the reasons why only new drivers should need to pass the test suite, as at that point there is somebody who can test the driver; later on that is no longer the case.
The ideal solution in these cases would be for someone to develop an emulator, or at least stubs for testing under native_sim/qemu, but that is a lot of work and would, as you mentioned, raise the bar for entry, which is why I consider that nice to have but not required (and there is an argument to be made that these emulators should be developed by the hardware developers, but that is a whole other discussion).

step-gpios = <&arduino_header 10 0>; /* D4 */
en-gpios = <&arduino_header 11 0>; /* D5 */
msx-gpios = <&arduino_header 12 GPIO_ACTIVE_HIGH>, /* D6 */
<&arduino_header 13 GPIO_ACTIVE_HIGH>; /* D7 */
Copy link
Member


		msx-gpios = <&arduino_header 12 GPIO_ACTIVE_HIGH>, /* D6 */
			    <&arduino_header 13 GPIO_ACTIVE_HIGH>; /* D7 */

Copy link
Member

@jilaypandya jilaypandya Mar 28, 2025


What's the upside of adding these overlays for so many different boards?

Copy link
Contributor Author

@jbehrensnx jbehrensnx Mar 31, 2025


Partly they are the result of me testing these drivers on these boards to see how they perform. The other reason is that counters, and possibly the work queue, can behave slightly differently between boards. And the step-dir drivers sometimes use the step-dir implementation in slightly different ways. Thus, testing drivers on many different boards can improve the chance of finding errors.

Copy link

github-actions bot commented Jun 3, 2025

This pull request has been marked as stale because it has been open (more than) 60 days with no activity. Remove the stale label or add a comment saying that you would like to have the label removed otherwise this pull request will automatically be closed in 14 days. Note, that you can always re-open a closed pull request at any time.

Adds a check for 0 steps to take for move_by in both
step_dir_stepper_common and the h-bridge driver.

Signed-off-by: Jan Behrens <[email protected]>
@jbehrensnx
Copy link
Contributor Author

Next attempt. Moved all tests from drv84xx/api to stepper_api. I also had to fix some bugs in existing drivers.

This is still work in progress.

One addition is the drivers.stepper.stepper_api.generic test scenario. It is intended as an entry point for tests on physical hardware. The existing test scenarios don't work, as all of them load additional overlay files, which causes issues. That said, I am not really happy with my solution (and it has its own issues), so I am open to other ideas.

Also, before somebody asks, I removed the LOG_DBG in stepper_print_event_callback because it would cause native_sim to crash.

{
int32_t pos = 10u;
int ret;
(void)stepper_enable(fixture->dev);
Copy link
Member


I think a better check of these calls is necessary, rather than void-ing out the returns. You can do something like:

zassert_equal(stepper_enable(fixture->dev), 0, "Failed to enable device");

Copy link
Contributor Author


I don't think it is still necessary. Enable only charges the coils and has no impact on the stepper behavior otherwise. That said, a dedicated test for checking enable/disable would be an option. Checking it in every test case would be overkill imo.

Copy link
Member


I don't think it is still necessary.

Could you justify, why asserting the function call is not necessary? You would never know if the function call was erroneous if you void the return.

And this is not a unique pattern. Here are some examples from the ADC, RTC or EEPROM api tests.

Copy link
Contributor Author


Could you justify, why asserting the function call is not necessary? You would never know if the function call was erroneous if you void the return.

As written above, since one of the recent changes to the stepper API, the enable/disable functions no longer influence the step logic. Thus their success or failure is irrelevant. Enable/disable is currently used because it was previously necessary and because it would appear in a workflow on physical hardware, but I am currently wondering if it would be better to just remove them entirely.

@jilaypandya What is your opinion, should the enable/disable function calls be just removed? They no longer fulfill the functionality they previously did.

Copy link
Member

@jilaypandya jilaypandya Jul 5, 2025


One should always assert the return values of function calls in tests.
You can do something like this:
zassert_not_equal(stepper_enable(fixture->dev), -EIO, "Failed to enable device");, because if enable returns -ENOTSUP or 0, the test suite shall pass.

You can move enable into before, so that you don't have to repeat it in each and every test.

@dipakgmx thanks for pointing this out, I will make sure I do this as well.
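
A minimal sketch of that suggestion, assuming the stepper_fixture and setup function of the existing suite (names are illustrative):

#include <errno.h>

/* Assert the enable call once in the suite's before() hook instead of voiding it
 * in every test case; -ENOTSUP or 0 lets the suite continue, a real I/O error fails it.
 */
static void stepper_before(void *f)
{
	struct stepper_fixture *fixture = f;

	zassert_not_equal(stepper_enable(fixture->dev), -EIO, "Failed to enable device");
}

/* Registered via the before slot of the suite, e.g.:
 * ZTEST_SUITE(stepper, NULL, stepper_setup, stepper_before, NULL, NULL);
 */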

@github-actions github-actions bot removed the Stale label Jun 28, 2025
@jbehrensnx
Copy link
Contributor Author

jbehrensnx commented Jul 3, 2025

The drivers.stepper.stepper_api.generic issues can be seen in the failed tests, as I would have to exclude all boards the CI is run on, which is not an option. I am thinking about instead allowing (and adding an overlay for) a single board for this test scenario. This should fix the current issue with .generic and allow developers to use it with the -K twister flag.

If a physical board is still a no-go, then allowing a single emulation platform (maybe qemu_x86_64/atom, as it is already used for the h-bridge driver) with a conventional overlay file is also an option.

EDIT: Edited

@jilaypandya
Copy link
Member

The drivers.stepper.stepper_api.generic issues can be seen in the failed tests as I would have to blacklist all boards the ci is run on, which is not an option. I am thinking about instead whitelisting (and adding an overlay for) a single board for this test scenario. This should fix the current issue with .generic and allow developers to use it with the -K twister flag.

If a physical board is still a no-go, then whitelisting a single emulation platform (maybe qemu_x86_64/atom, as it is already used for the h-bridge driver) with a conventional overlay file is also an option.

https://docs.zephyrproject.org/latest/contribute/coding_guidelines/index.html#rule-a-2-inclusive-language

@jbehrensnx
Copy link
Contributor Author

Removed the test filtering and improved the range check for the run tests.

I also removed the .generic scenario. I somehow forgot, when creating it, that the additional overlay files are platform filtered. The error that made me believe otherwise was something else on my side.

@@ -18,6 +19,8 @@ struct k_poll_signal stepper_signal;
struct k_poll_event stepper_event;
void *user_data_received;

#define TEST_STEP_INTERVAL (20 * USEC_PER_SEC)
Copy link
Member


This has to be a Kconfig Configuration Option


ZTEST_F(stepper, test_set_micro_step_res_valid)
{
int ret =
stepper_set_micro_step_res(fixture->dev, CONFIG_STEPPER_TEST_MICROSTEP_RESOLUTION);
Copy link
Member


This looks a bit strange. Most probably happened because of clang-format.
Adjust the code as follows:

	int ret;

	ret = stepper_set_micro_step_res(fixture->dev, CONFIG_STEPPER_TEST_MICROSTEP_RESOLUTION);

}

ZTEST_F(stepper, test_target_position_w_fixed_step_interval)
Copy link
Member

@jilaypandya jilaypandya Jul 11, 2025


Was this test similar to a test in the drv84xx test suite, and hence deleted?

Please add the tests from drv84xx to stepper_api in one commit, then delete the duplicate test cases and make the corresponding adjustments to the stepper_api test suite in another commit. Thanks.

Copy link
Contributor Author


Moved the removal to the drv84xx removal commit and renamed it to clarify that it removes superfluous tests.

Separating modifications to the tests into a separate commit would be very difficult at this point, especially as some changes are needed to run the tests. That said, I have adjusted the commit message to indicate that there are also some modifications to tests.

{
int32_t pos = 10u;
int ret;
(void)stepper_set_microstep_interval(fixture->dev, CONFIG_STEPPER_TEST_MICROSTEP_INTERVAL);
Copy link
Member


One of the previous review comments about asserting the return values of stepper_api functions needs to be addressed. Thanks.

Copy link
Contributor Author


Considering how often this function is called, I moved it into the before function and added a separate test for this. Because these tests are only for non-acceleration drivers, this should suffice.

Moves the tests of the drv84xx/api test suite into the stepper_api test
suite. To facilitate this and for cleanup purposes, several tests are
modified.

Signed-off-by: Jan Behrens <[email protected]>
Removes the drv84xx/api test suite as well as a duplicate test in
stepper_api.

Signed-off-by: Jan Behrens <[email protected]>

Successfully merging this pull request may close these issues.

Deprecate/Remove Driver Specific API tests in stepper area.
4 participants