#title #ubuntu-meeting: Ubuntu Friendly Squad

Meeting started by brendand at 15:02:08 UTC. The full logs are available at http://ubottu.com/meetingology/logs/ubuntu-meeting/2011/ubuntu-meeting.2011-10-24-15.02.log.html .

== Meeting summary ==

*Scoring system - jedimike
*AOB

''ACTION:'' cr3 to add gremlins/detect to checkbox (brendand, 15:47:42)
''ACTION:'' jedimike to prepare a small usability survey about the site (brendand, 16:01:24)

Meeting ended at 16:01:59 UTC.

== Votes ==

== Action items ==

* cr3 to add gremlins/detect to checkbox
* jedimike to prepare a small usability survey about the site

== Action items, by person ==

* cr3
** cr3 to add gremlins/detect to checkbox
* jedimike
** jedimike to prepare a small usability survey about the site

== People present (lines said) ==

* brendand (75)
* jedimike (32)
* cr3 (27)
* ara (22)
* roadmr (19)
* akgraner (15)
* meetingology (5)

== Full Log ==

15:02:08 #startmeeting Ubuntu Friendly Squad
15:02:08 Meeting started Mon Oct 24 15:02:08 2011 UTC. The chair is brendand. Information about MeetBot at http://wiki.ubuntu.com/AlanBell/mootbot.
15:02:08
15:02:08 Available commands: #accept #accepted #action #agree #agreed #chair #commands #endmeeting #endvote #halp #help #idea #info #link #lurk #meetingname #meetingtopic #nick #progress #rejected #replay #restrictlogs #save #startmeeting #subtopic #topic #unchair #undo #unlurk #vote #voters #votesrequired
15:02:33 hello!
15:02:36 hi
15:02:36 hey :)
15:02:45 The agenda for today is:
15:02:47 Scoring system - jedimike
15:02:47 AOB
15:02:58 #topic Scoring system - jedimike
15:03:10 o/
15:03:11 jedimike - proceed :)
15:03:29 currently, the rules for rejection and failure of submissions are these:
15:03:29 * If you skip all tests in a core category, we reject the submission
15:03:30 * If you skip an individual test in a core category that requires external hardware that not everyone might have (writable media, for example), we ignore that test and apply a penalty to the score, but still allow that category to pass if all other tests pass
15:03:30 * If you skip an individual test in a core category that you should have run, we fail the category and score the system 1 star
15:03:30 That last rule seems far too harsh. I think that skipping an individual test in a core category that we don't allow you to skip should result in the submission being rejected, as we have a lot of 1 star systems that are only 1 star because someone skipped a test.
15:03:43 251 systems, to be precise
15:04:03 o/
15:04:06 ..
15:04:13 ara - go ahead
15:04:48 I agree that we should change the 3rd rule to reject the submission
15:04:56 if a non-skippable test was skipped
15:05:04 o/
15:05:14 and I would put something in the Participate page to make things clearer
15:05:27 ..
15:05:41 back to you jedimike
15:06:34 I'd also put something in that intermediate "report a problem" page that says, "Can't find your results?" and explains why we reject submissions, and directs them towards either results tracker, or where we need them to look, to view their submission
15:06:36 ..
15:06:54 o/
15:07:09 ara - shoot
15:07:41 yes, I agree. I think that we don't need to be really fancy on letting people know exactly why it was rejected
15:07:53 a nice explanation of the basics should be fine
15:08:07 and yes, that explanation should be at Participate and Report a problem
15:08:28 o/
15:08:29 not sure if pointing to the submissions is needed, though
15:08:29 ..
15:09:01 o/
15:09:04 just a quick piece, to say that in the long term if we could enforce running of the tests through the checkbox UI that would be nice
15:09:05 ..
15:09:21 roadmr - go ahead
15:09:29 we should make sure that reasons for rejecting submissions are made abundantly clear
15:09:31 o/
15:09:43 we already see people wondering why their submissions take so long to appear in UF
15:09:54 so if we just start "eating" them there's bound to be some complaints
15:10:00 it goes to being as transparent as possible
15:10:16 o/
15:10:29 ideally I'd like to see checkbox say "this submission is not UF-complete so it'll be rejected" or something to that effect (i.e. instant feedback) but it may not be possible to do :(
15:10:37 still this is something to be looked into, for transparency's sake.
15:10:38 ..
15:10:40 o/
15:10:55 cr3 - your turn
15:10:57 brendand: dude, I want to write a testing game! seriously though, it doesn't really need to contribute to UF but might be nice :)
15:11:00 ..
15:11:20 ok, after that, back to ara
15:11:34 not jedimike?
15:11:45 jedimike it is
15:11:47 cr3 - you're right
15:11:57 jedimike?
15:12:02 brendand: /unignore jedimike :)
15:12:18 I agree with roadmr that we do need something to tell the users at least which of their submissions made it and which didn't
15:12:25 we don't need to break down scores or anything
15:12:36 but it might improve the quality of submissions
15:12:54 if we were able to say "X submission didn't get accepted because you skipped audio/xyz"
15:13:17 and it would cut out the "why didn't my results get listed" questions totally
15:13:18 ..
15:13:40 ara, your turn (and apologies to jedimike)
15:14:19 so, I agree with roadmr, but those are ideas for 12.04 LTS, when we improve the UI
15:14:32 but we need to take them into account, of course
15:15:12 with the UI changes we can do a much better job of letting people know what they have to run for the submission to be accepted in UF
15:15:33 o/
15:15:44 for now, an explanation of what a core category is and how it works might help
15:15:44 ..
15:15:58 jedimike, go ahead
15:16:45 if we're not going to make the submissions available to the users through UF, can I put in a feature request for results tracker so that a user's test runs are linked to from their page on there?
15:16:45 ..
15:17:03 o/
15:18:17 we need to remember that even though ubuntu-friendly is in beta, if we don't keep things transparent we risk creating frustration for users
15:18:53 o/
15:19:39 one challenge i think we have is that we are able to update UF pretty much as we want, but not the source of the tests (i.e. checkbox)
15:20:10 so we need to avoid making changes in UF which really require support from checkbox
15:20:26 this is one of those i think
15:20:27 ...
15:20:37 jedimike - you can go now
15:20:46 oops sorry I am late
15:21:33 akgraner - no problem. we're discussing a possible change to the scoring system.
15:21:50 * akgraner catches up
15:22:00 akgraner - thanks
15:22:03 jedimike?
15:22:05 just to say, transparency is good :) and at the moment it's not clear if your submission has made it or not, and I think if it's possible to make that change on results tracker to link the user's page to their test submissions it would help (it even helps me respond to bug reports!)
15:22:11 ..
15:22:41 o/
15:22:45 o/
15:23:02 akgraner - you go first
15:23:47 Is there any way to say - not let the skipped question count against the over scoring, and note somewhere that these scores don't include skipped questions
15:24:06 overall scoring I meant
15:24:11 ...
15:25:35 I mean I might have skipped a test just b/c I didn't have a USB stick handy, that doesn't mean it didn't pass
15:25:43 akgraner - the issue is that, at the moment, tests like the audio ones are skippable. and it seems a lot of people are skipping them. but if we only have one submission for a system then we need to make a call about what that means.
15:25:58 me needs to leave now
15:26:02 bye ara
15:26:06 * ara will read the minutes tomorrow
15:26:07 cheers
15:26:19 * brendand continues
15:26:21 brendand, I go back and re-do the test for the ones I skip once I find all my stuff :-)
15:26:22 o/
15:27:04 we don't want to say that a skipped test means that the component must work, but neither can we say for sure that it means it doesn't
15:27:20 so the best thing to do is probably to not accept these submissions at all
15:27:44 but some people don't have external monitors
15:27:57 so you would disregard their test on that one point
15:28:00 akgraner - but for tests which need special equipment we already have a different rule, so your system can still get a good score if you didn't test external monitor or usb
15:28:14 o/
15:28:21 ah ok :-)
15:28:30 akgraner: if a test requires special equipment, like a USB stick or external monitor, we allow people to skip it without failing that component
15:29:06 ..
15:29:19 jedimike - you can go ahead now
15:29:28 gotcha - sorry I didn't know that. /me is quiet now :-)
15:29:46 was just going to say that :) and that we need to make that clear on the participate page
15:30:21 and implement what ara said about making it clear that if you skip tests that don't require external equipment
15:30:28 your submission may not be included in the site
15:30:29 ..
15:31:10 akgraner - an example of a test you *can't* skip is audio/alsa_record_playback_internal, since it doesn't require extra equipment
15:31:18 cr3, your turn
15:31:20 if I understand correctly, skippable tests may affect scoring between 3-5, but non-skippable tests are those that may affect scoring between 1-3
15:31:23 I think someone made a point that non-skippable tests should be enforced in the checkbox UI so that people don't get surprised with a crappy score between 1-3
15:31:26 ..
15:31:34 cr3 - that was me :)
15:31:49 cr3 - but the problem is we'd need to SRU that change in
15:32:22 o/
15:32:55 brendand: oneiric is beta, it could be argued that we're really targeting precise with the ultimate ninja solution
15:33:27 actually, if i could expand on this. i'm not sure i feel too awesome about the 'skippable'-ness of tests being encoded in the u-f site itself
15:33:42 it should really be encoded in checkbox
15:34:25 ..
15:35:43 it seems that most people agree we need to reject submissions that don't have all the tests run that must be run
15:36:03 the question is to what extent do we guide the users about this?
15:36:36 the ideal would be to enforce it in checkbox, but in the short term it needs to be stated on the UF website
15:36:40 but...
15:37:19 it's important to remember that the assumption that everyone will engage with UF directly through the site is probably wrong
15:37:46 so some users may not even read the participate page
15:37:47 ...
15:38:32 o/
15:38:46 akgraner - go ahead
15:39:22 brendand, your assumption is right - many people find out about system testing on their computer, run the test, *then* find out about the site
15:40:03 and I know people who never read documentation (sadly) but it does happen...
15:40:22 o/
15:40:32 * brendand points to self
15:40:38 cr3 - your turn
15:41:13 it would be nice to see the number of submissions to launchpad before and after ubuntu friendly was announced, there was already a large number of submissions coming in before, probably from people just discovering checkbox and running it for the heck of it
15:41:17 ..
15:42:39 cr3 - indeed
15:43:44 brendand: modulo hardware certification submissions, of course :)
15:45:14 brendand: is it time for aob?
15:45:32 seems like everyone is done on this topic
15:45:36 #topic AOB
15:46:17 AOB?
15:46:31 o/
15:47:12 cr3 - yep
15:47:13 checkbox needs tests for gremlins because they've been misbehaving in my computer lately :)
15:47:16 ..
15:47:42 #action cr3 to add gremlins/detect to checkbox
15:47:42 * meetingology cr3 to add gremlins/detect to checkbox
15:48:10 "Does your gremlin have a shiny coat, and I don't mean a pimp coat!"
15:49:12 o/
15:49:51 roadmr - is this still about gremlins?
15:49:58 nope
15:50:07 roadmr - then please, go ahead :)
15:50:28 heheh, just wanted to get a feel for how useful people think the number of "raters" is on the UF front page
15:50:58 between 1 and 10, I'd say 11 :)
15:51:11 ... basically just that, should that information be available at a glance, or is it ok to move it to the system's detail page for instance?
15:51:14 ..
15:52:31 roadmr - i think it depends on whether people are using it the way we imagine they would (i.e. to get an idea of how 'reliable' the results are)
15:53:54 hmm maybe at some point we could conduct a poll on how useful people visiting UF think each bit of information is
15:54:06 maybe at UDS?
15:54:39 I was thinking something like those "would you like to answer a poll to help improve our site?" prompts - to get a feel from normal, average users
15:54:57 I think the UDS crowd may be too biased towards preferring a lot of information :)
15:55:15 roadmr: maybe it can be used as an excuse to make people aware of the site in the first place
15:55:31 roadmr - i hate those things :) but yeah, you're right it should be from normal site users
15:55:34 roadmr: although, since ara will be making a presentation, everyone will inevitably know about it. maybe ask her to announce the poll?
15:55:36 cr3: that too! and to encourage a bit more participation, which is always good
15:56:28 I think that deserves an action item for ara if we all agree a poll would be useful. I don't see how it could hurt
15:57:08 from the little usability testing I've done, it's been tremendously useful
15:57:35 if anyone intends to prepare some usability testing sessions, you might like to read: Rocket Surgery Made Easy
15:58:21 we'd have to have the poll ready before UDS (i.e. in about 10 days)
15:58:21 ... and don't let jedimike run the session otherwise he'll jedi mind trick everyone to say what he wants to hear :)
15:59:04 who'd like to prepare some questions?
16:00:07 considering we're nearly out of time, i propose cr3
16:00:29 brendand: I thought jedimike would be better placed for writing the questions, no?
16:00:43 jedimike - do you want to do that?
16:01:01 brendand: yeah
16:01:24 #action jedimike to prepare a small usability survey about the site
16:01:24 * meetingology jedimike to prepare a small usability survey about the site
16:01:37 ok, i think our time here is up
16:01:49 thanks everyone for your participation
16:01:59 #endmeeting

Generated by MeetBot 0.1.5 (http://wiki.ubuntu.com/AlanBell/mootbot)
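
For reference, the sketch below is a minimal Python illustration of the submission rules discussed under the "Scoring system" topic, including the proposed change (a skipped core test that needs no external hardware rejects the submission instead of scoring it 1 star) and cr3's summary that hardware-related skips only move a score within the 3-5 star range. It is not the actual Ubuntu Friendly or checkbox code; the class, function, and parameter names and the penalty value are hypothetical.

{{{
# Illustrative sketch only -- not the real Ubuntu Friendly / checkbox logic.
from dataclasses import dataclass

@dataclass
class TestResult:
    category: str         # e.g. "audio"
    core: bool            # belongs to a core category
    needs_hardware: bool  # requires external gear (USB stick, external monitor, ...)
    outcome: str          # "pass", "fail" or "skip"

REJECTED = "rejected"     # submission is not listed on the site at all

def evaluate(results, penalty_per_skip=0.5):
    core = [r for r in results if r.core]
    for category in {r.category for r in core}:
        in_category = [r for r in core if r.category == category]
        # Rule 1: every test in a core category skipped -> reject the submission.
        if all(r.outcome == "skip" for r in in_category):
            return REJECTED
        # Proposed rule 3: a skipped core test that needs no external hardware
        # should have been run -> reject, instead of scoring the system 1 star.
        if any(r.outcome == "skip" and not r.needs_hardware for r in in_category):
            return REJECTED
    # Rule 2: skipped hardware-dependent tests only carry a score penalty, so
    # a submission with no failures stays in the 3-5 star range.
    hardware_skips = sum(1 for r in core
                         if r.outcome == "skip" and r.needs_hardware)
    if all(r.outcome != "fail" for r in core):
        return max(3.0, 5.0 - penalty_per_skip * hardware_skips)
    # Failures push the score into the 1-3 star range; the exact formula was
    # not part of this discussion, so it is left out of the sketch.
    return None

# Example: skipping audio/alsa_record_playback_internal (no hardware needed)
# would now reject the submission outright.
print(evaluate([TestResult("audio", True, False, "skip"),
                TestResult("audio", True, False, "pass")]))  # -> "rejected"
}}}

A hypothetical pre-upload check in checkbox could run the same rules locally to give the instant "this submission is not UF-complete" feedback roadmr described, rather than silently rejecting the submission server-side.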