16:10:27 <balloons> #startmeeting QA Community Roundtable
16:10:27 <meetingology> Meeting started Thu Jun 21 16:10:27 2012 UTC.  The chair is balloons. Information about MeetBot at http://wiki.ubuntu.com/meetingology.
16:10:27 <meetingology> 
16:10:27 <meetingology> Available commands: #accept #accepted #action #agree #agreed #chair #commands #endmeeting #endvote #halp #help #idea #info #link #lurk #meetingname #meetingtopic #nick #progress #rejected #replay #restrictlogs #save #startmeeting #subtopic #topic #unchair #undo #unlurk #vote #voters #votesrequired
16:11:03 <patdk-wk> hm?
16:11:43 <balloons> Julien and Phil from ubuntu both don't seem to be around ;-(
16:11:57 <balloons> checking on kubuntu folks then we'll get started
16:12:44 <Effenberg0x0> balloons: you had scheduled 2 hours. Maybe they will still make it
16:13:56 <balloons> ok, pm's sent to everyone ;-)
16:14:05 <balloons> so, let's start
16:14:36 <balloons> [TOPIC] UTAH Demo
16:14:48 <balloons> huh, bot doesn't like topics
16:15:03 <balloons> well, anyways, first item on the agenda was to demo UTAH
16:15:18 <tgm4883> balloons, perhaps #topic
16:15:21 <balloons> don't think we can get a demo together, but hggdh can explain a little bit about what it is
16:15:29 <balloons> #topic UTAH Demo
16:15:40 <superm1> i thought there was going to be a google hangout or something to show it off?
16:15:52 <tgm4883> hmm, meetingology apparently is a liar when it comes to available commands
16:15:53 <hggdh> no, no demo right now, unfortunately
16:16:36 <balloons> superm1, hggdh should be able to demonstrate how VMs are used in automated testing. but sadly, not a UTAH demo atm
16:16:39 <superm1> so for now everyone stare at http://www.enchantedlearning.com/usa/states/utah/map.GIF and imagine as hggdh describes :)
16:16:51 <balloons> :-)
16:17:01 <Effenberg0x0> I like the green. Very vivid.
16:17:32 <hggdh> UTAH (Ubuntu Testing Automation Harness) is the new baseset we are moving to for automated testing on Ubuntu
16:18:15 <hggdh> it will be (when fully implemented) flavour-agnostic, so it can be used for all *ubuntu
16:18:44 <hggdh> right now, beginning of development, it only supports VMs (via libvirt)
16:18:44 <superm1> is the input it takes an ISO image?
16:19:03 <hggdh> yes, it uses ISO images to build the target test environment
16:19:23 <superm1> specifically, "desktop" ISO images, not alternate
16:19:30 <hggdh> the installation is preseeded, so there is no input required.
16:19:51 <hggdh> not only desktop, but also alternate and server
16:19:58 <superm1> okay i see
16:20:13 <superm1> sorry, i can hold off questions until the end if you would like, i just realized this might be a bit rude to interject
16:20:16 <hggdh> (which is to say, ubiquity and debian-installer based installs)
16:20:38 <hggdh> superm1: no, please shoot the questions as we go
16:20:51 <hggdh> I do not mind :-)
16:21:07 <superm1> Ok. will you have a place to put preseeds in this tool for the different flavours?
16:21:21 <hggdh> this, on the other hand, means that you might have to adjust the preseeds to your specific needs
16:21:25 <hggdh> yes, we will
16:21:29 <superm1> i know at least mythbuntu and ubuntu studio do have custom preseeds
16:21:31 <superm1> Ok
16:22:07 <hggdh> being able to adjust preseeds is pretty much a requirement if you want to automate tests
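A preseed file, as discussed above, is just a list of debconf answers, one per line in `owner question type value` form. A minimal sketch of how a flavour might generate its own variant; the question names are standard debian-installer ones, but the values and the helper are illustrative, not UTAH's code:

```python
SAMPLE_ANSWERS = [
    # (owner, question, type, value) -- real d-i question names, sample values
    ("d-i", "debian-installer/locale", "string", "en_US"),
    ("d-i", "keyboard-configuration/layoutcode", "string", "us"),
    ("d-i", "passwd/username", "string", "ubuntu"),
]

def render_preseed(answers):
    """Render (owner, question, type, value) tuples into the
    one-line-per-answer format debconf preseed files use."""
    return "".join(f"{owner} {question} {qtype} {value}\n"
                   for owner, question, qtype, value in answers)
```

A flavour-specific preseed would swap in or append its own answers (Mythbuntu and Ubuntu Studio carry custom ones, as noted below) before handing the file to the installer.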
16:22:37 <superm1> so the unfortunate flaw in doing it this way that comes to mind is that sometimes you will have bugs that are exposed only when the installation is run in an interactive mode or only in an automatic mode etc
16:23:12 <superm1> so it can't be a complete replacement to lots of user testing, but instead a valuable supplement
16:23:19 <hggdh> UTAH was originally called 'uath'. But we found that (1) nobody could pronounce it, and (2) it means 'fear, horror' in ancient Gaelic
16:23:57 <hggdh> generically speaking, automated testing *cannot* replace actual hands-on
16:24:28 <hggdh> it is just a way of getting the bits that do not directly depend on user input tested, and out of the way
16:24:43 <hggdh> but we still need to have manual installs, and testing
16:25:01 <Effenberg0x0> hggdh: Just to be clear, can you mention some cases in which UTAH will raise a flag?
16:26:05 <hggdh> yes
16:27:35 <hggdh> let's say you are testing upgrades from desktop -- it is easily automated: having an existing (say) Precise install, you run 'sudo do-release-upgrade -d', and use pexpect to drive the answers
16:28:08 <hggdh> in this case, we are looking for failures to upgrade -- missing pre-reqs, new package version fails to install, etc
16:28:52 <hggdh> of course, we cannot check for positioning of windows, and text visibility in windows, etc. But we will get the "backoffice" errors
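The upgrade flow hggdh describes can be sketched as a small pexpect driver. This is illustrative, not UTAH's actual code: the helper name and the prompt patterns are assumptions.

```python
def answer_prompts(child, script):
    """Drive an interactive command by answering each expected prompt.

    `child` is anything exposing pexpect's expect()/sendline() interface,
    e.g. pexpect.spawn("sudo do-release-upgrade -d", encoding="utf-8").
    """
    for pattern, answer in script:
        child.expect(pattern)    # block until the prompt appears
        child.sendline(answer)   # type the canned answer
```

A run would pass something like `[(r"Continue \[yN\]", "y")]`, then wait on `child.expect(pexpect.EOF)`; a non-zero `child.exitstatus` afterwards is the kind of "backoffice" failure being looked for.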
16:29:55 <hggdh> or you are testing a specific package with the testset provided by it (say, mysql, or even coreutils). So you install the image at the version of Ubuntu you want, and run these tests
16:30:30 <hggdh> (for coreutils, you need to _build_ the package again, coreutils tests are intermixed with the build process)
16:30:55 <hggdh> but you will get errors because of a change in libraries
16:31:34 <superm1> what kind of failures get raised in the upgrade testing?  will bugs get filed?
16:31:39 <hggdh> or you want to check on ecryptfs -- we have a set of tests for it, and they are fully automated
16:32:18 <hggdh> no bugs are opened automagically. We still need to look at the failures and identify the cause
16:32:55 <Effenberg0x0> hggdh: got it. Some debugging skills are needed by a human tester.
16:33:13 <hggdh> the point here is to weed out false positives -- errors caused by test code, not by what is actually being tested
16:33:23 <hggdh> Effenberg0x0: always
16:33:59 <hggdh> when you are testing you have to look for real and false positives and negatives
16:34:24 <hggdh> usually, we assume the negatives (which is to say, the expected results) are correct
16:35:07 <hggdh> but, every so often, you should look at your "correct" results, and verify they are indeed correct
16:35:12 <superm1> there will be some sort of notification mechanism for those flavors interested when things fail and need some further intervention?
16:36:38 <hggdh> UTAH has a provision to alert users. You can set it and use it. In our case, most of the UTAH tests will be run via Jenkins (http://jenkins-ci.org/), so we use Jenkins to do the alerting
16:37:11 <hggdh> another point we all should be careful on is on destructive/non-destructive tests
16:37:35 <hggdh> I personally define a destructive test as a test that unilaterally changes your system configuration
16:38:07 <hggdh> look at 'change your system configuration' as 'can destroy your system'
16:38:44 <hggdh> it does not matter if it actually completely borks, but it might -- for example -- change the DNS resolution
16:39:29 <superm1> will flavours be able to run utah in the canonical jenkins instance, or need to set up their own?
16:39:41 <hggdh> on the other hand, this Monday we had a ecryptfs test that actually forces a reboot of the system: a piece of the kernel goes haywire, and I/O to ecryptfs cannot be guaranteed to work until the reboot
16:40:46 <hggdh> superm1: I cannot really answer that, sorry. But... we -- right now -- do not have the resources to guarantee test space for the flavours
16:41:10 <superm1> OK
16:41:35 <superm1> Daviey warned that jenkins is a PITA to get set up
16:41:38 <hggdh> so, at least right now, please do not expect we will be able to run tests for other than Ubuntu
16:42:04 <Effenberg0x0> hggdh: The typical ISO-Test (Install/Partitioning, writing MBR/Grub, installing/removing drivers/kernel modules like VGA) is handled by UTAH?
16:42:18 <hggdh> Effenberg0x0: yes indeed
16:42:36 <hggdh> superm1: not really a pita, but it does have a learning curve
16:43:06 <hggdh> the easiest way of using UTAH would be to deploy VMs
16:43:32 <superm1> can UTAH be ran without jenkins then on its own via VM deployments?
16:43:44 <hggdh> deploying bare-metal will, of course, require bare-metal and additional packages/networks (so that MAAS, for example, can be deployed)
16:44:06 <hggdh> superm1: yes, it can, and this is how I was starting to test it
16:44:13 <superm1> ah great
16:44:16 <balloons> if I can interject, it would also be good to throw some links at you for this : https://launchpad.net/utah
16:44:24 <hggdh> all you need is libvirt and friends
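Since the easy path is VMs via libvirt, a quick way to poke at the environment is `virsh list --all`. Below is a tiny helper one might write around that output when scripting test runs; it assumes virsh's usual three-column table and is not part of UTAH:

```python
def parse_virsh_list(output):
    """Turn the table printed by `virsh list --all` into {name: state}."""
    domains = {}
    for line in output.splitlines()[2:]:   # skip the two header lines
        parts = line.split(None, 2)        # state may contain spaces ("shut off")
        if len(parts) == 3:
            _id, name, state = parts
            domains[name] = state
    return domains
```

In practice you would feed it `subprocess.check_output(["virsh", "list", "--all"], text=True)` to see which test VMs exist and whether they are running.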
16:44:42 <hggdh> balloons: thank you, did not really have time to prepare
16:44:56 <superm1> so really need to just check it out and start playing to see where questions crop up
16:44:58 <balloons> there's a wiki off that page with more info and a picture
16:45:46 <hggdh> (and I am, right now, testing a new set of kernel SRU tests, and the KVM I am using is driving me nuts, popping up on my monitor every time the machine thinks about doing something)
16:45:59 <balloons> additionally, if you have further specific questions once you're playing with it, there's a mailing list set up for it now: ubuntu-utah-dev@lists.ubuntu.com
16:46:30 <hggdh> superm1: yes. Not only to learn, but to tell us where we, ah, did something wrong
16:46:57 <superm1> great, thanks!  just need to find some time to actually use it and experiment now :)
16:47:08 <hggdh> the code resides at...
16:47:52 <balloons> it's linked above hggdh https://launchpad.net/utah, lp:utah
16:48:05 <hggdh> heh
16:48:22 <Effenberg0x0> hggdh: How do we prioritize/filter UTAH-Testing bug reports, amidst the constant flow of new bug reports on LP (correct/incorrect ones) everyday?
16:48:35 <hggdh> so please bzr branch, and play -- or install the daily utah package from the PPA
16:49:23 <hggdh> Effenberg0x0: what we are doing internally, is to tag all bugs we find (via whatever test process, including manual testing) as a qa finding
16:49:52 <balloons> fyi, there's a daily and stable ppa.. you can find them here: https://launchpad.net/~utah
16:50:00 <balloons> link to stable: https://launchpad.net/~utah/+archive/stable
16:50:34 <Effenberg0x0> hggdh, OK
16:51:22 <balloons> ok, any more questions on UTAH for hggdh?
16:53:15 <balloons> alright if not, next up is testcase management
16:53:22 <balloons> which you get to listen to me for ;-)
16:53:46 <balloons> I'll include visuals..
16:53:57 <balloons> I'm wondering if it makes sense for me to type it or speak it
16:54:09 <balloons> I'm concerned if I only have the video it won't make the log
16:54:37 <balloons> I can screenshare and type to you all, or you can view a live feed of me speaking with my desktop :-)
16:55:09 <balloons> let's try the type and view
16:55:11 <balloons> http://www.screenleap.com/ubuntuqa
16:56:16 <balloons> hopefully everyone can see my screen now?
16:56:27 <Effenberg0x0> OK here balloons
16:56:29 <superm1> yup
16:56:42 <cariboo907> OK here too
16:56:56 <balloons> As you know the qatracker just went through an update to bring testcase management
16:57:20 <balloons> I'm going to show you how it works from a behind the scenes admin perspective
16:57:29 <balloons> we'll use the staging site on dev.stgraber.org
16:58:00 <balloons> so I have a couple products laid out here, mimicking the iso.qa.ubuntu.com tracker
16:58:29 <balloons> testcases and pages look pretty similar, except for the addition of the extra links
16:59:02 <balloons> clicking on a test (http://packages.qa.dev.stgraber.org/qatracker/milestones/224/builds/16281/testcases/1305/results) you can see there are some new links, and the testcase is now included inline
16:59:09 <balloons> hi highvoltage!
16:59:21 <highvoltage> hi balloons :)
16:59:23 <balloons> http://www.screenleap.com/ubuntuqa
16:59:29 <balloons> you can watch and follow along
16:59:56 <balloons> ok, so that's nice having the testcase in there
17:00:13 <balloons> you'll also notice the boilerplate text on the bottom for submitting your result and filing a bug
17:00:49 <balloons> the bug reporting instructions are currently set at a product level
17:01:11 <balloons> the testcases are defined and then grouped into testsuites which can then be used by any product
17:01:14 <superm1> product meaning "ubuntu" "mythbuntu" etc?
17:01:21 <balloons> superm1, yes
17:01:25 <balloons> or, a package
17:01:35 <balloons> like the calls for testing of the kernel
17:01:40 <balloons> I'll show that quickly
17:02:17 <balloons> you can see we've had a call for testing for the kernel, and it's had 2 versions
17:02:35 <balloons> silly me called the 3.4 version precise1, because I was still learning the tool
17:02:57 <balloons> so, there's one testcase in here, smoke test
17:03:10 <balloons> similar boilerplate text
17:03:16 <balloons> we're looking at (http://packages.qa.dev.stgraber.org/qatracker/milestones/223/builds/16283/testcases/1301/results)
17:03:45 <balloons> filing a bug on this package has some further instructions than normal, and includes a link
17:03:56 <balloons> that link is tagging the bug as well
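The "link that tags the bug" works by pre-filling Launchpad's +filebug form through query parameters. A sketch of how such a link could be built; the `field.tags` parameter name is an assumption about Launchpad's form, and the tag values are examples:

```python
from urllib.parse import urlencode

def filebug_url(package, tags, series="precise"):
    """Build a Launchpad +filebug link with pre-filled bug tags."""
    base = f"https://bugs.launchpad.net/ubuntu/+source/{package}/+filebug"
    query = urlencode({"field.tags": " ".join([series] + list(tags))})
    return f"{base}?{query}"
```

The tracker's boilerplate would then embed a link like `filebug_url("linux", ["qa-kernel-smoke"])`, so every report from that testcase arrives already tagged.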
17:04:31 <balloons> if we look at installation instructions for the package, we get a howto install and howto uninstall
17:05:08 <balloons> finally, the detailed information on the testcase has the history of what the testcase looked like
17:05:18 <balloons> this is useful for when we update a case, but have old results
17:05:26 <balloons> the results and case will match up
17:05:57 <balloons> So you can see some of my earlier attempts and playing with formatting, etc
17:06:20 <balloons> ok, so let's take a look at the admin side of things
17:07:02 <balloons> As you can see, we allow you to define a template for new testcases
17:07:21 <balloons> heh, I changed it a bit
17:07:30 <superm1> does that mean we need to work through an admin like you to make our own test cases?
17:07:55 <balloons> err, actually.. there
17:08:10 <balloons> the template has gone thru a couple revisions before going live
17:08:24 <balloons> superm1, no it doesn't mean you'll need me to help you make testcases
17:08:29 <balloons> we'll get to that in a min ;-)
17:09:13 <balloons> ok, so anyways, that's the example template for a testcase as decided upon last dec by the qa community
17:09:31 <balloons> it could change of course if we decided to change it.. and if so, we could update the template at that time
17:09:32 <balloons> :-0
17:09:46 <balloons> ok, so let's look at the testcases quickly
17:10:16 <balloons> you can see my smoke test mostly follows the template -- except that it has no expected results.
17:10:28 <balloons> but the admin side, you simply title it, and add the testcase
17:10:37 <balloons> notice we do use some html / drupal markup
17:10:50 <balloons> some is allowed and we use it to help display as well as for machine parsing
17:11:21 <balloons> here's an example of an isotest testcase
17:12:00 <balloons> this wiki page http://testcases.qa.ubuntu.com/Install/DesktopFree has been converted into the testcase you see
17:12:23 <balloons> http://packages.qa.dev.stgraber.org/qatracker/milestones/224/builds/16281/testcases/1306/results
17:12:50 <balloons> ok, so that's testcases more or less
17:12:57 <balloons> now, we can organize testcases into testsuites
17:13:16 <balloons> you see the 'free software only' testcase is part of the ubuntu desktop suite
17:13:32 <balloons> well.. 'ubuntu desktop extras' actually :-)
17:13:43 <balloons> there are 3 tests defined in here you can see
17:13:51 <balloons> I can set the order, weight and status as expected
17:14:25 <balloons> now, let's go assign our testcases/testsuites to a product
17:15:02 <balloons> you can see I've linked both the 'ubuntu desktop' and 'ubuntu desktop extras' testsuites to the ubuntu desktop amd64 product
17:15:09 <balloons> for precise
17:15:47 <balloons> now, some of this may be rather foreign to all of you, but there is a guide to this interface which I'll link to at the end
17:16:06 <balloons> the new stuff I am showing you is not yet in that guide. since, it's new (as in this week new :-) )
17:16:39 <balloons> now to answer superm1's question a little bit, there is a new role to the admin interface
17:17:03 <balloons> we will have a testcase admin role which will allow you to define and manage the testcases
17:17:13 <balloons> I've setup a team on lp to do this
17:17:24 <balloons> anyone who is on the team will have access to manage testcases
17:17:56 <balloons> https://launchpad.net/~ubuntu-testcase is the team
17:18:18 <balloons> Phil graciously agreed to help trial out this new interface
17:18:39 <balloons> you'll notice the team is restricted
17:18:54 <balloons> my goal is not to prevent anyone from contributing, but some control is needed
17:19:16 <balloons> for anyone running the tests, I would like them to be able to suggest new tests or improvements by filing a bug against ubuntu-qa
17:19:52 <balloons> like any other community, make a few contributions and it will make sense to make you an admin should you wish to be
17:20:07 <balloons> this is all VERY new, so I'd love input from all of you on how to shape this
17:20:27 <balloons> suffice to say, the flavors should all have at least one person who has this access
17:20:34 <superm1> that's what i was just going to say
17:20:41 <superm1> i'm glad tgm4883 volunteered for mythbuntu
17:20:47 <balloons> :-)
17:20:48 <mrand> hahaha
17:21:50 <balloons> I'd like everyone on the team to also make sure the tests stay maintained and not fall into dis-use or out of date
17:22:05 <balloons> so it's a bit of responsibility, but not too much I don't think
17:22:16 <balloons> more or less, if you're active in testing, you would be doing / have done this anyway
17:22:42 <balloons> so questions?
17:23:01 <Effenberg0x0> Balloons: Doesn't it sort of overlap with UTAH?
17:23:16 <balloons> besides the testcase management piece, the new qatracker should be able to support all of our testing needs (insomuch as we want to have a test and record results)
17:23:30 <balloons> it's my hope we can consolidate what we're doing by using it
17:23:33 <balloons> Effenberg0x0, how so?
17:23:52 <balloons> This is intended for manual testing, UTAH is intended for automated testing
17:24:15 <balloons> of course, over time we will continue to close the gap on that.. how/where the results get recorded, etc
17:24:49 <Effenberg0x0> I know, I mean some automated tests might kill the need for some manual test-cases
17:24:59 <Effenberg0x0> How to keep things paired up
17:25:16 <balloons> Effenberg0x0, my longer term view is to automate away everything that makes sense
17:25:45 <balloons> the pairing up if you will of what is automated vs manual happens via our communication
17:26:12 <Effenberg0x0> Ok, so the community looks at test-cases and define what's still valid or not, got it
17:26:14 <balloons> this cycle once we have migrated all of our pre-existing tests over I would like to see us take a review of them
17:26:32 <balloons> I took a work item personally to review the testcases for iso testing to ensure they make sense, are needed, etc
17:26:57 <balloons> but yes, as a community, on an on-going basis, we should help frame what is needed for manual testing and where it makes sense for us to help
17:27:16 <balloons> example of this is the kernel tests. you notice the "smoke tests" are rather simple and light
17:27:32 <balloons> the intense tests are being automated by hggdh and the canonical and kernel teams
17:27:57 <balloons> the manual testing piece of that is getting it out to many more different workloads and bits of hardware
17:28:20 <Effenberg0x0> Ok
17:28:50 <balloons> anything else?
17:29:08 <balloons> If not we'll migrate onto the next piece and i'll shut down the screenshare for the time being
17:30:05 <balloons> ok, so we talked about the new tracker and testcases
17:30:27 <balloons> briefly I wanted to mention milestones.. Kate asked me to remind the flavors that you can skip milestones
17:30:49 <balloons> but if you do commit to doing a milestone she asks you do it with full force.. aka, you see it through to the end ;-)
17:31:26 <balloons> feel free to interrupt me at any time btw
17:31:40 <balloons> next up we wanted to discuss collaboration
17:31:52 <superm1> i haven't kept up with the thread about abolishing milestones, but what if that happens?
17:32:07 <superm1> have you thought about how the tracker would scale for that?
17:32:11 <balloons> superm1, yes, that thread has grown and been quite a discussion
17:32:58 <balloons> from a tools point of view, our "milestones" can remain the same.. We can run nothing but dailies for ISO testing if needed, and calls for testing are already of variable length
17:33:22 <balloons> as far as what's going to happen, I am not completely sure as it's still being discussed
17:34:32 <balloons> however, from an Ubuntu perspective we're being asked (again, we were asked at UDS / before UDS) to test more regularly; meaning not just at milestones
17:35:19 <balloons> from that perspective nothing in theory has changed.. but the cadence of exactly when we do this "regular" testing is being suggested to be 2 weeks
17:35:26 <superm1> i see, okay
17:35:35 <balloons> the schedule we adopted after UDS was pretty much the same.. about every 2 weeks
17:35:44 <superm1> and i understand that flavours can follow the cadence they would like in this regime too
17:36:03 <balloons> yes, of course flavors can choose to follow the cadence or not
17:36:20 <balloons> I recommend adopting a cadence that works for the flavor
17:36:26 <balloons> considering the devs, testers, etc
17:36:41 <balloons> hence, adopting every ubuntu milestone doesn't always make sense..
17:36:52 <GridCube> (i could speak very very unoficially for xubuntu)
17:36:55 <balloons> I liked the LTS only approach some flavors have thought about as well
17:37:01 <balloons> GridCube, hello :-)
17:37:05 <GridCube> :)
17:38:02 <balloons> on collaborating, part of us all getting together is to talk about needs we might have and how we can work together to benefit each other
17:38:25 <balloons> maintaining testcases in mutual fashion and sharing them across flavors as it makes sense, is one such example
17:39:03 <balloons> but I also think we can do things like collaborated calls for testing (like we have done with the kernel testing), or on specific packages that multiple flavors use like firefox
17:39:29 <balloons> whatever it is.. floor is open to whomever has a need or idea for collaboration
17:40:22 <GridCube> o/
17:40:28 <balloons> yes GridCube
17:40:39 <balloons> brb, type away :-)
17:41:05 <GridCube> hello, i present myself: i'm a bug tester and support collaborator for xubuntu, i've been so for over a year now, never participated here though.
17:42:22 <GridCube> i proposed a small change on the qa tracker a while ago: that the test cases should have some sort of area where all the reported bugs for that particular case on previous days could be seen
17:42:54 <tgm4883> wait, what
17:43:08 * tgm4883 scowls at superm1
17:43:15 <superm1> ;)
17:43:27 <balloons> that delay!
17:43:27 <balloons> epic
17:43:37 <balloons> GridCube, this exists: http://iso.qa.ubuntu.com/qatracker/reports/defects
17:44:46 <balloons> additionally when you look at a testcase, it has a list of previously reported bugs
17:44:52 <balloons> but only for that build I believe
17:45:04 <GridCube> only for that build
17:45:05 <balloons> regardless, it's worth filing a bug to discuss how it might look
17:45:12 <balloons> I like the idea
17:45:26 <GridCube> the wishlist bug is already there
17:45:30 <GridCube> im trying to find it
17:45:40 <balloons> GridCube, ahh, ok, point it out to me.. I'll subscribe
17:46:35 <GridCube> ok
17:47:19 <GridCube> give me a sec
17:47:29 <balloons> ok, so anything else.. This last piece is general Q & A time. hggdh is still around (though dealing with his flailing computer) feel free to ask any more questions
17:47:45 <GridCube> https://bugs.launchpad.net/ubuntu-qa-website/+bug/994816
17:48:15 <GridCube> :)
17:48:26 <balloons> it appears like that was implemented?
17:48:49 <balloons> ahh I think you mean https://bugs.launchpad.net/ubuntu-qa-website/+bug/994812
17:49:16 <GridCube> ahm, it was like "granted" but no one actually implemented it
17:49:25 <balloons> I subbed.. I'll look into it.
17:49:25 <GridCube> balloons, yes those two go together :)
17:49:28 <balloons> thanks for the suggestion
17:50:21 <GridCube> no problem, i just thought it would make reporting and testing easier, because people would know what to look at
17:50:35 <hggdh> I am here
17:51:00 <balloons> ok, so if there's no more questions we can discuss the testcase management stuff quickly.
17:51:25 <balloons> The old wiki testcases need to be converted; I have done a couple already.. In addition to converting them, I placed them into the new templated format
17:52:03 <balloons> you can see everything in powerpc and amd64+mac has the new testcase format
17:52:03 <balloons> http://iso.qa.ubuntu.com/qatracker/milestones/219/builds/17586/testcases
17:52:30 <balloons> I take that back, heh, 'Live Session' doesn't
17:53:03 <balloons> So I will be going through and doing the same with the other testcases used by the ubuntu iso's
17:53:12 <balloons> however, many of those testcases are used by the flavors as well
17:53:57 <balloons> for instance, the xubuntu desktop i386 testcases
17:54:05 <balloons> they use those same 4 tests
17:54:29 <balloons> the only one not converted is the wubi (windows installer) testcase
17:55:03 <balloons> convert that one and you can convert that whole ISO over
17:55:40 <balloons> of course, the flavors can then pick and choose which tests to keep in common and which to write for themselves
17:55:58 <balloons> you can pull any testcase you wish and include it with any other combination of testcases to go into a testsuite
17:56:21 <balloons> the examples I gave that are live actually are 2 testsuites, so that even the testsuites can be shared across multiple isos
17:57:04 <balloons> so for example, the xubuntu desktop i386 iso can use the 'ubuntu desktop' testsuite which contains all 4 testcases it's using, barring the wubi testcase
17:57:34 <balloons> write the wubi testcase and then add it to a testsuite and include it on the iso
17:57:40 <balloons> I trust that all makes sense :-)
17:58:36 <balloons> So, please ping me to get access to help out in this area. Your ISOs and testcases will remain usable as-is (you can see the legacy mode on the tracker), but can be converted now at any time
17:59:09 <balloons> let me know who is interested in joining this new team and I'll help get them going on how to use the tool, etc
18:00:00 <balloons> and with that I think we're done. ;-) Thanks to you all for coming out. I appreciate your time. And thanks for suggesting we meet during the cycle. I think it was helpful
18:00:10 <balloons> #endmeeting