From nobody@FreeBSD.org  Wed Nov  3 21:00:58 2004
Return-Path: <nobody@FreeBSD.org>
Received: from mx1.FreeBSD.org (mx1.freebsd.org [216.136.204.125])
	by hub.freebsd.org (Postfix) with ESMTP id 1033416A4D6
	for <freebsd-gnats-submit@FreeBSD.org>; Wed,  3 Nov 2004 21:00:57 +0000 (GMT)
Received: from www.freebsd.org (www.freebsd.org [216.136.204.117])
	by mx1.FreeBSD.org (Postfix) with ESMTP id ED86143D64
	for <freebsd-gnats-submit@FreeBSD.org>; Wed,  3 Nov 2004 21:00:14 +0000 (GMT)
	(envelope-from nobody@FreeBSD.org)
Received: from www.freebsd.org (localhost [127.0.0.1])
	by www.freebsd.org (8.12.11/8.12.11) with ESMTP id iA3L09cC026537
	for <freebsd-gnats-submit@FreeBSD.org>; Wed, 3 Nov 2004 21:00:09 GMT
	(envelope-from nobody@www.freebsd.org)
Received: (from nobody@localhost)
	by www.freebsd.org (8.12.11/8.12.11/Submit) id iA3L09sc026536;
	Wed, 3 Nov 2004 21:00:09 GMT
	(envelope-from nobody)
Message-Id: <200411032100.iA3L09sc026536@www.freebsd.org>
Date: Wed, 3 Nov 2004 21:00:09 GMT
From: jim feldman <secmgr@jim-liesl.org>
To: freebsd-gnats-submit@FreeBSD.org
Subject: gvinum can't init raid5 set
X-Send-Pr-Version: www-2.3

>Number:         73499
>Category:       kern
>Synopsis:       gvinum can't init raid5 set
>Confidential:   no
>Severity:       non-critical
>Priority:       low
>Responsible:    le
>State:          closed
>Quarter:        
>Keywords:       
>Date-Required:  
>Class:          sw-bug
>Submitter-Id:   current-users
>Arrival-Date:   Wed Nov 03 21:10:32 GMT 2004
>Closed-Date:    Sat Nov 27 15:12:03 GMT 2004
>Last-Modified:  Sat Nov 27 15:12:03 GMT 2004
>Originator:     jim feldman
>Release:        5.3 RC2
>Organization:
>Environment:
FreeBSD greybrd.xxx.xxx.net 5.3-STABLE FreeBSD 5.3-STABLE #0: Tue Nov  2 03:52:27 MST 2004     root@greybrd.xxx.xxx.net:/usr/obj/usr/src/sys/GREYBRD  i386

>Description:
I had a working RAID-5 set under 5.3-RC1 composed of four SCSI drives. I updated the source using cvsup and the 5.3-RELENG tag. After make buildworld && make buildkernel && make installkernel, a reboot to single-user mode, make installworld, and mergemaster, I rebooted.

After the reboot, all the subdisks in the RAID-5 plex showed as stale. If I use
"gvinum rm -r volname", it deletes what it should. If I then re-create the set, it is re-created, but the subdisks are still stale.
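
For context, a RAID-5 volume like the one described here would typically be defined in a gvinum configuration file along these lines (a hypothetical sketch: the drive names, device paths, and stripe size are illustrative, not taken from this report):

```
# Hypothetical gvinum config for a 4-drive RAID-5 volume.
# Device paths and the 512k stripe size are illustrative only.
drive d1 device /dev/da0s1a
drive d2 device /dev/da1s1a
drive d3 device /dev/da2s1a
drive d4 device /dev/da3s1a
volume raid5vol
  plex org raid5 512k
    sd length 0 drive d1
    sd length 0 drive d2
    sd length 0 drive d3
    sd length 0 drive d4
```

A file like this would be loaded with "gvinum create <configfile>".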
>How-To-Repeat:
      Create a RAID-5 plex-based volume with gvinum under RC1, then update to RC2 and watch the plex become corrupt and unfixable.
>Fix:
      
>Release-Note:
>Audit-Trail:
Responsible-Changed-From-To: freebsd-i386->le 
Responsible-Changed-By: simon 
Responsible-Changed-When: Wed Nov 3 23:01:43 GMT 2004 
Responsible-Changed-Why:  
Over to gvinum author. 

http://www.freebsd.org/cgi/query-pr.cgi?pr=73499 
State-Changed-From-To: open->feedback 
State-Changed-By: le 
State-Changed-When: Sat Nov 6 15:22:07 GMT 2004 
State-Changed-Why:  
Have you tried to 'start <raid5volume>' after you created it? 

http://www.freebsd.org/cgi/query-pr.cgi?pr=73499 
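
For reference, the step the maintainer is asking about: after a RAID-5 volume is created, its parity must be initialized with 'start' before the subdisks come up. A hypothetical session (the volume and file names are illustrative):

```
# Hypothetical sequence; "raid5.conf" and "raid5vol" are illustrative names.
gvinum create raid5.conf   # define drives, plex, and subdisks
gvinum start raid5vol      # initialize parity; subdisks go from stale to up
gvinum list                # verify all subdisks show "up"
```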

From: secmgr <security@jim-liesl.org>
To: freebsd-gnats-submit@FreeBSD.org, secmgr@jim-liesl.org
Cc:  
Subject: Re: kern/73499
Date: Thu, 11 Nov 2004 11:07:57 -0700

 This can be closed out as user error / poor documentation.
State-Changed-From-To: feedback->closed 
State-Changed-By: le 
State-Changed-When: Sat Nov 27 15:11:42 GMT 2004 
State-Changed-Why:  
Closed on submitter's request - pilot error due to missing documentation. 

http://www.freebsd.org/cgi/query-pr.cgi?pr=73499 
>Unformatted:
