> I'm not opposed to it, if that's what the maintainers prefer, but I'd like you to think about the toil this creates for me, constantly rewriting stuff that seemed to be agreed upon befo...
> This way all issues will be detected in one pass.
Not bad! This kind of handling is what I've missed in so many tools I've used over the years, which only fail at the first problem they hit.
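To make the "detect all issues in one pass" idea concrete, here is a minimal sketch (all names and fields are hypothetical, not the actual implementation): instead of raising on the first problem, the validator accumulates every error and reports them together.

```python
def validate_lockfile(data: dict) -> list[str]:
    """Collect every problem instead of raising on the first one (hypothetical helper)."""
    errors = []
    if "metadata" not in data:
        errors.append("missing 'metadata' section")
    for i, artifact in enumerate(data.get("artifacts", [])):
        if "checksum" not in artifact:
            errors.append(f"artifact {i}: missing 'checksum'")
        if "download_url" not in artifact:
            errors.append(f"artifact {i}: missing 'download_url'")
    return errors

# both the missing metadata section and the missing artifact field
# are reported in a single pass
errors = validate_lockfile({"artifacts": [{"checksum": "sha256:abc"}]})
```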
The unfortunate reality here is that it's a nested model, so doing it by the book would IMO mean effectively validating the lockfile twice: once as a dict from the YAML parser, then looping over all arti...
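The two-pass shape being described could look roughly like this (a sketch only, with hypothetical field names; the dict stands in for what the YAML parser would return):

```python
def validate_toplevel(lock: dict) -> list[str]:
    # first pass: structural checks on the dict the YAML parser returned
    errors = []
    if not isinstance(lock.get("artifacts"), list):
        errors.append("'artifacts' must be a list")
    return errors

def validate_artifact(index: int, artifact: dict) -> list[str]:
    # second pass: field-level checks on each nested artifact
    errors = []
    for field in ("download_url", "checksum"):
        if field not in artifact:
            errors.append(f"artifact {index}: missing '{field}'")
    return errors

lock = {"artifacts": [{"download_url": "https://example.com/a.tar.gz"}]}
errors = validate_toplevel(lock)
for i, art in enumerate(lock.get("artifacts") or []):
    errors.extend(validate_artifact(i, art))
```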
Pairs is used here as a general concept, not as typing notation. I will change it to reflect L30 better.
> nitpick: Do we want this to be a map? I mean most commonly (those I have seen) are alway...
Fully agree with @brunoapimentel that all but one of those tests are isolated to lockfile validation, which doesn't integrate with any external functionality, so unit tests are sufficient.
nitpick: Do we want this to be a map? I mean, most of the ones I have seen are always string lists. I know that it probably makes it just a bit simpler to get a checksum object out of it, but I...
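To make the trade-off concrete, here are the two shapes side by side (both hypothetical, not the actual lockfile schema): the map gives you algorithm/digest pairs directly, while the string-list form needs an extra parsing step before use.

```python
# map form: algorithm -> digest, trivially consumed as key/value pairs
checksums_map = {"sha256": "deadbeef", "md5": "abc123"}

# list form, as seen in many other lockfiles: "algorithm:digest" strings
checksums_list = ["sha256:deadbeef", "md5:abc123"]

def parse_entry(entry: str) -> tuple[str, str]:
    # the list form needs this split before a checksum object can be built
    algorithm, _, digest = entry.partition(":")
    return algorithm, digest
```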
Because you are checking if something already exists in the set. That seems a bit off to me. Sets do not contain duplicate values. That's why I raised the question.
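For context, one common reason for that pattern (a hypothetical example, not necessarily what the code in question does): the membership check is not there to keep the set duplicate-free, since the set does that silently; it is there to notice and report that the *input* contained a duplicate.

```python
def find_duplicate_urls(urls: list[str]) -> list[str]:
    # adding to the set would silently deduplicate; the membership
    # check beforehand is what lets us report the duplicate input
    seen: set[str] = set()
    duplicates = []
    for url in urls:
        if url in seen:
            duplicates.append(url)
        seen.add(url)
    return duplicates

find_duplicate_urls(["a", "b", "a"])  # → ["a"]
```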
I agree that we can drop the lockfile-format-related tests, as they are well covered by unit tests. However, I'd argue that we should keep those that verify lockfile targets in some way, as those ...
Thanks. There were a couple of options, but based on previous review comments, this seems to be the preferred one. I'm aware of the downsides that come with it.