Hi,
due to some disk swapping I ended up with a degraded RAID 1. One disk is fine - I can read/write data on it - but I'm having trouble rebuilding the array to its original state. The problem is that the sync never finishes: it goes from 0% to 100% and then starts again at 0%, over and over (this has been going on for 3 days so far). I have tried reformatting / removing every partition on the "bad" disk from a different PC, but no luck.
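If it matters, this is the kind of thing I could try again on the other PC to wipe the old md metadata more thoroughly (just a sketch; /dev/sdX is whatever the "bad" disk shows up as over there):

# clear any leftover md superblocks on the old partitions
mdadm --zero-superblock /dev/sdX1
mdadm --zero-superblock /dev/sdX2
# then clear the remaining signatures / partition table
wipefs -a /dev/sdX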
Today I got frustrated and installed fun_plug so I could run some more diagnostics. From what I have learnt, the Linux partition (md0) synced OK, so the problem seems to be somewhere in the data partition (md1). Using mdadm / mdstat I can see the array rebuilding (it shows the same % as the web interface) and I can't see anything wrong (results of those commands are below).
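These are the extra checks I was planning to run from the fun_plug shell, in case someone spots something I missed (assuming dmesg and smartctl are even available there; if not I can test the disk in another box):

# kernel messages around md and the second disk, looking for read errors
dmesg | grep -iE 'md1|sdb|error'
# current rebuild state of the data array
cat /sys/block/md1/md/sync_action
cat /sys/block/md1/md/mismatch_cnt
# SMART health of the disk that keeps rebuilding
smartctl -a /dev/sdb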
So my question is: does anyone have any idea how to get the sync to finish, or where to look for the problem (other logs/tools to check)?
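In particular, I was wondering whether it would be safe to kick the second disk out manually from the fun_plug shell and re-add it, so the rebuild starts completely from scratch - something along these lines (just a sketch, assuming /dev/sdb2 stays the member that keeps rebuilding):

# mark the endlessly rebuilding disk as failed and remove it from the array
mdadm /dev/md1 --fail /dev/sdb2
mdadm /dev/md1 --remove /dev/sdb2
# wipe its md superblock so it comes back as a fresh member
mdadm --zero-superblock /dev/sdb2
# add it back and let the resync run again
mdadm /dev/md1 --add /dev/sdb2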
Thanks,
Adam
PS: I made a backup the day the array went mad, so I can of course reformat and copy the data back from the backup, but I would prefer the DNS to handle it on its own.

root@DNS-320:/var/log# mdadm -D /dev/md1
/dev/md1:
Version : 00.90.03
Creation Time : Sat Aug 11 13:56:51 2012
Raid Level : raid1
Array Size : 973629312 (928.53 GiB 997.00 GB)
Device Size : 973629312 (928.53 GiB 997.00 GB)
Raid Devices : 2
Total Devices : 2
Preferred Minor : 1
Persistence : Superblock is persistent
Update Time : Tue Aug 14 19:42:17 2012
State : clean, degraded, recovering
Active Devices : 1
Working Devices : 2
Failed Devices : 0
Spare Devices : 1
Rebuild Status : 23% complete
UUID : 49011aa4:ae3efc70:8cf25191:d6505b70
Events : 0.1292
   Number   Major   Minor   RaidDevice   State
      0       8       2        0         active sync        /dev/sda2
      2       8      18        1         spare rebuilding   /dev/sdb2
root@DNS-320:/var/log# cat /proc/mdstat
Personalities : [linear] [raid0] [raid1]
md1 : active raid1 sda2[0] sdb2[2]
973629312 blocks [2/1] [U_]
[=====>...............] recovery = 25.2% (245760000/973629312) finish=127.7min speed=94928K/sec
md0 : active raid1 sda1[0] sdb1[1]
530048 blocks [2/2] [UU]
unused devices: <none>