# Wednesday, August 06, 2008

Read Nikolai's announcement of the latest drop of Pex.

posted on Wednesday, August 06, 2008 11:06:59 AM (Pacific Daylight Time, UTC-07:00)  #    Comments [0]
# Thursday, July 31, 2008

Alexander Nowak has started a series of blog posts on Pex and already has 6 episodes!

  • Pex - test Case 5 (regular expressions)
  • Pex - test case 4 (strings and parameter validation)
  • Pex - Test case 3 (enums and business rules validation)
  • Pex - test case 2 
  • Pex - test case 1
  • Starting with Pex (Program Exploration)

The posts give a nice view of Pex from a user's perspective, and compare it against classic testing techniques such as equivalence classes.

posted on Thursday, July 31, 2008 7:25:49 AM (Pacific Daylight Time, UTC-07:00)  #    Comments [0]
    # Tuesday, July 29, 2008

    Linear programming problems are usually solved using the simplex algorithm. While it is easy to encode a constraint system of linear equalities and inequalities as a Parameterized Unit Test for Pex, there is currently no way to tell Pex that we want test inputs that are “minimal” according to a custom objective function. However, Pex can still generate *surprising* feasible solutions.

    Let's start with a simple set of linear inequalities that define our problem.

[PexMethod]
public int Test(int x, int y)
{
    // PexAssume is used to add 'constraints' on the input;
    // in this case, we simply encode the inequalities in a boolean formula
    PexAssume.IsTrue(
        x + y < 10 &           // using bitwise & to avoid introducing branches
        5 * x + 2 * y > 20 &
        -x + 2 * y > 0 &
        x > 0 & y > 0
    );
    // the profit is returned so that it is automatically logged by Pex
    return x + 4 * y;
}

After running Pex, we get one feasible solution. As expected, it is not optimal, since we don't apply the simplex algorithm.

[screenshot: Pex results showing the generated feasible solution]
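To make "feasible" concrete, here is a hypothetical point checked by hand (not necessarily the one Pex reports):

// hypothetical feasible point, chosen by hand for illustration
int x = 4, y = 3;
System.Diagnostics.Debug.Assert(x + y < 10);          // 7 < 10
System.Diagnostics.Debug.Assert(5 * x + 2 * y > 20);  // 26 > 20
System.Diagnostics.Debug.Assert(-x + 2 * y > 0);      // 2 > 0
System.Diagnostics.Debug.Assert(x > 0 && y > 0);
System.Console.WriteLine(x + 4 * y);                  // profit: 16 (the integer optimum appears to be 33, at x = 1, y = 8)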

    Enter overflows

Remember that .NET arithmetic operations will silently overflow unless you execute them in a checked context? Let's push our luck and try to force an overflow by changing the x > 0 constraint to x > 1000:

    [PexMethod]
    public int Test(int x, int y)
    {
        PexAssume.IsTrue(
            x + y < 10 &
            5 * x + 2 * y > 20 &
            -x + 2 * y > 0 &
            x > 1000 & y > 0
        );
        return x + 4 * y;
    }

Z3, the constraint solver that Pex uses to compute new test inputs, uses bitvector arithmetic to find a surprising solution that fulfills all the inequalities (our profit has just gone through the roof :)).

[screenshot: Pex results showing the overflowing solution and its huge profit]
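To see why a value of x above 1000 can still satisfy x + y < 10 in unchecked arithmetic, here is a tiny illustration (the values are hypothetical, not the ones Z3 actually picks):

int x = int.MaxValue;                 // hypothetical value, just to show the wraparound
int y = 5;
int sum = x + y;                      // unchecked by default: silently wraps around
System.Console.WriteLine(sum);        // -2147483644
System.Console.WriteLine(sum < 10);   // True, so the inequality is "satisfied"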

    Z3 is truly an astonishing tool!

    Checked context

    In order to avoid overflow, one should use a checked context. Let's update the parameterized unit test:

    [PexMethod]
    public int Test(int x, int y)
    {
        checked
        {
            PexAssume.IsTrue(
                x + y < 10 &
                5 * x + 2 * y > 20 &
                -x + 2 * y > 0 &
                x > 0 & y > 0
            );
        }
        return x + 4 * y;
    }

In that case, Pex actually generates two test cases: one that passes, and one that triggers an OverflowException (an implicit branch).
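The implicit branch comes from the fact that checked arithmetic throws instead of wrapping; a minimal sketch of the two paths Pex has to cover:

int a = int.MaxValue;                 // hypothetical value
try
{
    int sum = checked(a + 1);         // in a checked context, overflow throws
}
catch (System.OverflowException)
{
    // one generated test reaches this handler, the other stays on the normal path
}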

[screenshot: Pex results showing one passing test and one test ending in an OverflowException]

    Stay tuned for more surprising discoveries using Pex.

    posted on Tuesday, July 29, 2008 6:35:31 PM (Pacific Daylight Time, UTC-07:00)  #    Comments [2]
    # Thursday, July 03, 2008

Brian Keller dropped by our offices to record a movie on Pex. A good old whiteboard session to explain how Pex works:

    http://channel9.msdn.com/posts/briankel/Pex-Automated-Exploratory-Testing-for-NET/

     

    posted on Thursday, July 03, 2008 11:21:54 AM (Pacific Daylight Time, UTC-07:00)  #    Comments [0]
    # Monday, June 23, 2008

We've added a little filter to Pex that flags potential testability issues in your source code (at least in the context of unit testing). The idea is simple: if your code logic branches over APIs that depend on the environment (file system, UI, network, time, etc.), you're not doing 'pure' unit testing anymore, and that can be flagged as a testability issue.

    Why do we do this anyway?

The problem is that Pex is very sensitive to those issues: it cannot control the environment. Let's take a look at a simple example where we write a custom stream class with a testability issue in one of its constructors:

[screenshot: the custom stream class constructor]
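The actual code is in the screenshot; as a hedged sketch, the kind of constructor being described might look like this (the class and member names are hypothetical):

using System;
using System.IO;

public class CustomStream
{
    private readonly string fileName;

    public CustomStream(string fileName)
    {
        if (fileName == null)
            throw new ArgumentNullException("fileName");
        // branching over the file system: an environment-dependent check
        if (!File.Exists(fileName))
            throw new FileNotFoundException("could not find file", fileName);
        this.fileName = fileName;
    }
}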

The constructor checks that the file exists. In order to get past that point in the program, you would actually need to create or find an actual file on the file system. Once you're talking to the file system, you're already doing integration testing.

    Parameterized drama

    Let's be optimistic and write a parameterized unit test that reads a file:

[screenshot: the parameterized unit test]
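Again, the actual test is in the screenshot; a hypothetical sketch, reusing the CustomStream class sketched above:

[PexMethod]
public void ReadFile(string fileName)
{
    // the constructor is where the environment-dependent branch lives
    var stream = new CustomStream(fileName);
    // ... read from the stream and assert on its content ...
}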

    Pex comes back with a single test, passing null as fileName (i.e. default(string)). Since Pex does not know how the file system works, it cannot create a valid file name, and that's where the story ends.

[screenshot: the single generated test, with fileName = null]

    Help is on the way

    To help you diagnose these issues, we've added specialized logging flags. Pex shows hints in the issue notification area:

[screenshot: the testability hint in the issue notification area]

Clicking on that event will give you additional important data: which API was called and the stack trace at the time of the call :)

[screenshot: the event details, showing the API call and its stack trace]

    posted on Monday, June 23, 2008 11:08:33 PM (Pacific Daylight Time, UTC-07:00)  #    Comments [0]
    # Wednesday, June 04, 2008

Ever wonder what is happening inside the ResourceReader (where do those resources come from, anyway?)... Check out Nikolai's post on using Pex to test this (complicated) class...

    posted on Wednesday, June 04, 2008 7:17:06 PM (Pacific Daylight Time, UTC-07:00)  #    Comments [0]
    # Saturday, May 24, 2008

If you're interested in reading more about Pex, here are some online documents that should satisfy your curiosity:

The tutorial contains hands-on labs and exercises to understand how Pex generates new test cases. Highly recommended if you're interested in Pex.

    Happy reading!

    posted on Saturday, May 24, 2008 9:00:01 AM (Pacific Daylight Time, UTC-07:00)  #    Comments [0]
    # Thursday, May 22, 2008

    Nikolai broke the news; we've just released Pex 0.5... get it while it's hot!

    We're eager to hear some feedback, so don't hesitate to tell us what you think about it.

    posted on Thursday, May 22, 2008 9:30:00 AM (Pacific Daylight Time, UTC-07:00)  #    Comments [11]
    # Thursday, May 08, 2008
    posted on Thursday, May 08, 2008 8:40:27 AM (Pacific Daylight Time, UTC-07:00)  #    Comments [2]
    # Wednesday, February 27, 2008

    This is one problem for which we don't have an elegant solution yet:

    what is the best way to craft a name for a generated test?

    Let's see an example; given the following parameterized unit test,

[PexMethod] void Test(int i) {
    if (i == 123) throw new ArgumentException();
}

    Pex would generate 2 tests: i = 0, i = 123. So it seems doable to infer test names such as

[TestMethod] void Test0() { this.Test(0); }
[TestMethod, ExpectedException(typeof(ArgumentException))]
void Test123ThrowsArgumentException() { this.Test(123); }

So what's so difficult about it? Well, most PUTs aren't that simple, and as the size of the generated parameters increases, the method names might grow as well (strings getting bigger). Here's a list of potential problems:

• the method name should stay relatively short (less than 80 characters),
• the parameters might be objects or class instances, which do not render well as strings,
• generated strings might be huge and contain weird Unicode characters,
• the tests might involve mock choices, which again do not render well as strings,
• the more parameters there are, the more cryptic things become.

    Our approach: Timestamps

In light of all those problems, we've taken the shortcut route in Pex: we simply use a combination of the parameterized unit test method signature and the timestamp at which the test is generated:

    [TestMethod] void TestInt32_20080224_124301_35() { ... }

This is ugly; how can I change that?

We've added an extensibility point to support custom test naming schemes. After registering your 'namer', you get the opportunity to craft a test name following your favorite coding standard. You will have the generated test, output, exception, etc. at your disposal to make an intelligent choice there.
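Purely as an illustration of the kind of logic such a 'namer' might contain (this is not the actual Pex extensibility interface, just a hypothetical helper):

using System;
using System.Text;
using System.Text.RegularExpressions;

static class TestNamer
{
    // Build an identifier-friendly test name from the test data.
    public static string CraftName(string methodName, object[] arguments, Exception exception)
    {
        var name = new StringBuilder(methodName);
        foreach (object arg in arguments)
        {
            string text = arg == null ? "Null" : arg.ToString();
            text = Regex.Replace(text, "[^A-Za-z0-9]", "");      // keep identifier characters only
            if (text.Length > 10) text = text.Substring(0, 10);  // keep the name short
            name.Append('_').Append(text.Length == 0 ? "Empty" : text);
        }
        if (exception != null)
            name.Append("Throws").Append(exception.GetType().Name);
        return name.ToString();
    }
}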

Of course, you're also welcome to drop a comment on this post to suggest a better scheme.

    posted on Wednesday, February 27, 2008 12:18:46 AM (Pacific Standard Time, UTC-08:00)  #    Comments [2]