Wanted: Testing and feedback of Duplicate Master addin V2.0



brettdj
11-19-2009, 06:15 PM
Hi All,

I have made some significant upgrades to the Duplicate Master addin, mainly on string comparison, where the tool now offers:
case insensitive searches
ignore all whitespace searches (ASCII 9-13, 32, 160)
apply the worksheet CLEAN and/or TRIM functions
and, for those who want some serious matching capability, Regular Expression functionality (for example treat all numeric strings as "x", treat fred/frederick/freddy/fredy as fred etc) - a rough sketch of this kind of normalisation is shown below

The addin processing logic has also been optimised significantly and should be much quicker when running on all sheets of very large files.
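To give a feel for how those options interact, here is a rough, hypothetical VBA sketch of the sort of normalisation applied to each string before comparison - the routine name and option flags are illustrative only, not the addin's actual code:

Function NormaliseKey(ByVal sText As String, _
                      ByVal bIgnoreCase As Boolean, _
                      ByVal bIgnoreWhitespace As Boolean, _
                      ByVal bTrimClean As Boolean, _
                      ByVal sRegexPattern As String, _
                      ByVal sRegexReplace As String) As String
    'hypothetical sketch only - illustrates the string options, not the addin's internals
    Dim objRegex As Object

    If bTrimClean Then
        'worksheet TRIM and CLEAN: strip leading/trailing/repeated spaces and non-printing characters
        sText = Application.WorksheetFunction.Clean(Application.WorksheetFunction.Trim(sText))
    End If

    If bIgnoreWhitespace Then
        'remove tab, LF, VT, FF, CR, space and non-breaking space (ASCII 9-13, 32, 160)
        Set objRegex = CreateObject("vbscript.regexp")
        With objRegex
            .Global = True
            .Pattern = "[\x09-\x0D\x20\xA0]"
            sText = .Replace(sText, vbNullString)
        End With
    End If

    If Len(sRegexPattern) > 0 Then
        'user-supplied pattern, eg "\d+" with replacement "x" treats all numeric strings as "x"
        Set objRegex = CreateObject("vbscript.regexp")
        With objRegex
            .Global = True
            .IgnoreCase = bIgnoreCase
            .Pattern = sRegexPattern
            sText = .Replace(sText, sRegexReplace)
        End With
    End If

    If bIgnoreCase Then sText = UCase$(sText)

    NormaliseKey = sText
End Function

Two strings would then compare as duplicates when their normalised keys match, eg NormaliseKey("Fred Smith 01", True, True, True, "\d+", "x") and NormaliseKey("FREDSMITH99", True, True, True, "\d+", "x") both return "FREDSMITHX".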

Lastly I have added an option to exclude any items from a unique list if they occur more than once

I would greatly appreciate any testing and feedback so that I can finalise this beta version. I am also open to including additional functionality

Attached is the addin in xls format. For those running XL2003 it will add a new menu item to the Tools commandbar.

Much appreciated :beerchug:

Dave

<prior version of file removed. see bottom of thread>

Oorang
11-25-2009, 01:20 PM
Hi Dave,
Could you be a little more specific about what kind of feedback you are looking for? Things you would like tested, etc.

brettdj
11-26-2009, 06:01 PM
Sure,

Mainly the new functionality with respect to string comparisons.

Does it produce the expected outputs?
Is it error proof?
Is the layout OK and intuitive to use?
Are the new string functions useful? Should I add anything else?

Cheers

Dave

Oorang
11-30-2009, 09:20 AM
May I have your permission to re-factor the code?

brettdj
12-05-2009, 05:13 AM
> May I have your permission to re-factor the code?

Can you please explain what you mean? I am happy for you to suggest changes, or to use the code for your own purposes.

The version I posted above had a silly UserForm glitch. I will repost shortly

mweinman
01-13-2010, 12:00 PM
Bit of incompetence here, but I cannot figure out how to actually use this. I download the file and it opens Excel, I allow macros, and click Start Me, but it only scans the sheet it is on, and I cannot figure out how to save it or load it for use on another worksheet. Please help.

brettdj
01-14-2010, 09:57 PM
Hi,

I added this file as an unprotected xls file.

If you are running Xl2003 then you will have a new menu option under 'Tools'.

Attached is the xla version, which will be easier to run in xl2007 (under Addins)

Cheers

Dave

perripops
01-19-2010, 11:14 PM
Great add-on to Excel!!! I had around 600,000 cells I needed to get all the duplicates out of. It took about 48hrs of formatting, but all good!! Would be great if it worked a bit quicker though, but thanks anyway, couldn't ask for anything better for free!!!

Bob Phillips
01-20-2010, 02:56 AM
48 hours?

Couldn't you just add a helper column with a COUNTIF formula and then filter all values > 1?
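For example, with the values being checked in A2:A50001 (an illustrative range), put this in B2 and fill it down:

=COUNTIF($A$2:$A$50001,A2)

then AutoFilter column B for values greater than 1 to show only the duplicates (or for 1 to get the uniques).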

mweinman
01-20-2010, 08:00 AM
Thank you, works great and did exactly what I needed.

brettdj
01-21-2010, 11:42 PM
48 hours?

The code uses variant arrays (rather than time-intensive looping through ranges), so I am struggling to see why it would have taken that long. Were you colouring in all duplicate cells on a cell-by-cell basis?

I'd like to see your file if possible, and what settings you ran.
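For anyone wondering what the variant array approach means in practice, here is a rough sketch (the sheet and range names are illustrative, not the addin's code) of reading a whole range into memory, processing it there, and writing it back in a single operation instead of touching each cell individually:

Sub VariantArrayDemo()
    'illustrative only - shows the read-once / write-once pattern, not Duplicate Master's logic
    Dim vData As Variant
    Dim lRow As Long
    Dim rng As Range

    Set rng = ThisWorkbook.Worksheets("Sheet1").Range("A1:A50000")

    'single read of the whole range into a 2D variant array
    vData = rng.Value2

    'process entirely in memory (here, just upper-casing each entry)
    For lRow = LBound(vData, 1) To UBound(vData, 1)
        vData(lRow, 1) = UCase$(CStr(vData(lRow, 1)))
    Next lRow

    'single write back to the sheet
    rng.Value2 = vData
End Sub

The slow part of most duplicate-checking code is the sheet interaction, so collapsing it to one read and one write is where the speed comes from.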

brettdj
01-21-2010, 11:46 PM
Attached is V2.1

I have made one (last?) update to split out the unique row reports into their original condition on the output sheet.

In doing so I rediscovered a problem (and a fix) for working with variant arrays and ranges with long strings in Xl2003. I would have saved some time if I had read this first: http://www.mrexcel.com/forum/showthread.php?t=64613.

It is self-defeating to use single cell writes for variant arrays, as it removes the time saving desired in the first place. Oh well.

Happily this is not an issue for Xl2007
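For reference, the commonly reported symptom in Xl2003 is that writing a variant array back to a range fails once any element is longer than roughly 911 characters. A sketch of one possible workaround (an assumption for illustration, not necessarily the fix used in the addin) is to keep the fast block write and reserve single-cell writes for just the offending long strings:

Sub WriteArrayWithLongStringFallback(ByVal rngTarget As Range, ByVal vData As Variant)
    'sketch only: block-write everything that fits, then single-cell write the long strings
    'the 911 character figure is the limit commonly reported for array-to-range writes in Xl2003
    Dim lRow As Long
    Dim lCol As Long
    Dim collLong As Collection
    Dim vItem As Variant
    Const MAX_LEN As Long = 911

    Set collLong = New Collection

    'pass 1: pull out any strings too long for a block write and blank them in the array
    For lRow = LBound(vData, 1) To UBound(vData, 1)
        For lCol = LBound(vData, 2) To UBound(vData, 2)
            If Len(CStr(vData(lRow, lCol))) > MAX_LEN Then
                collLong.Add Array(lRow - LBound(vData, 1) + 1, lCol - LBound(vData, 2) + 1, vData(lRow, lCol))
                vData(lRow, lCol) = vbNullString
            End If
        Next lCol
    Next lRow

    'one fast block write for everything else
    rngTarget.Value2 = vData

    'pass 2: slow single-cell writes, but only for the handful of long strings
    For Each vItem In collLong
        rngTarget.Cells(vItem(0), vItem(1)).Value2 = vItem(2)
    Next vItem
End Sub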

Any future changes will be listed at Experts Exchange in this article to minimise the double-up:

http://www.experts-exchange.com/articles/Software/Office_Productivity/Office_Suites/MS_Office/Excel/Eliminating-duplicate-data-with-Duplicate-Master-V2.html

Regards

Dave

perripops
02-09-2010, 12:27 AM
> 48 hours?
>
> The code uses variant arrays (rather than time-intensive looping through ranges), so I am struggling to see why it would have taken that long. Were you colouring in all duplicate cells on a cell-by-cell basis?
>
> I'd like to see your file if possible, and what settings you ran.

Hi Brett, thanks for your help. It actually didn't work - Excel kind of freezes (not responding), maybe because it's such a big file. Can I email you the file and see if it works for you? I need either the double-ups deleted or highlighted, or a unique list created. Your help would be much appreciated!!

perripops
02-09-2010, 12:46 AM
> 48 hours?
>
> Couldn't you just add a helper column with a COUNTIF formula and then filter all values > 1?

Sorry, I thought I was experienced with Excel, but what you said makes no sense to me. I have tried the unique records function in Excel but it only does 1 column. I have about 30 columns with up to 50,000 rows per column.

brettdj
02-20-2010, 09:29 PM
Please send it to me:
ozbrettdjnospam
at
yahoo
dot
com
(remove nospam)