
Author Topic: duplicate prevention/view in a manner to prevent duplicates


faptastic

  • Coppermine novice
  • Posts: 21
duplicate prevention/view in a manner to prevent duplicates
« on: October 12, 2005, 04:38:55 am »

I'd like to see some sort of duplicate prevention, something like this:
when a user adds an image, its file size and type are checked against a table in the DB that holds the sizes and types (and IDs) of all previously added files.

Also, so deleted images don't keep showing up as matches, each image's entry is removed from this "index of sizes" when it is deleted.

The idea is that when an image with a similar file size and file type is added, the user is shown thumbnails of other potentially matching images so they can determine whether they are the same or not. If there is a match, the user can just click "Discard", or similar, and move on to the next image they have chosen to upload.
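Something like this rough sketch is what I have in mind (all table and column names here are made up, not anything Coppermine actually has; I'm using PDO just for brevity):

Code: [Select]
<?php
// Hypothetical "index of sizes" table:
//   size_index (file_id INT, filesize INT, filetype VARCHAR)

// On upload: look up previously added files with the same byte size and type,
// and return their IDs so thumbnails can be shown to the user
function find_possible_duplicates(PDO $db, $filesize, $filetype)
{
    $stmt = $db->prepare(
        'SELECT file_id FROM size_index WHERE filesize = ? AND filetype = ?');
    $stmt->execute(array($filesize, $filetype));
    return $stmt->fetchAll(PDO::FETCH_COLUMN);
}

// On delete: drop the entry so the deleted image stops showing up as a match
function remove_from_index(PDO $db, $file_id)
{
    $stmt = $db->prepare('DELETE FROM size_index WHERE file_id = ?');
    $stmt->execute(array($file_id));
}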

Just an idea...

artistsinhawaii

  • VIP
  • Coppermine addict
  • Posts: 856
    • evj art and photography
Re: duplicate prevention/view in a manner to prevent duplicates
« Reply #1 on: October 12, 2005, 06:17:32 am »

faptastic,

I really don't see how this would work. Like many others here, I use the same camera for all my pictures, so every photo starts out the same size. I then batch resize all of my photos from their original size to the maximum size I want on my site -- in my case, 1000 x 667 pixels. So virtually every photo that is uploaded to my site ends up the same size. What you are suggesting means I would have to scroll through every previously uploaded picture on my site to determine whether a new one has already been uploaded. Too much work for me.

Coppermine already checks for an identical file name, which I feel is adequate. If by chance I see two of the same pics on my site, I know that I have duplicates on my hard drive that need to be addressed and the improper one deleted. If it rejects the file name, I search my hard drive to see what other file has the same name and make changes accordingly. This is like a bonus feature for me.  ;)

Dennis

Joachim Müller

  • Dev Team member
  • Coppermine addict
  • Posts: 47843
  • aka "GauGau"
    • gaugau.de
Re: duplicate prevention/view in a manner to prevent duplicates
« Reply #2 on: October 12, 2005, 07:53:15 am »

I can see the need for such a feature if people have a "funnypics" or "babes" gallery, i.e. a gallery where users upload pics they do not own. There are many duplicates on sites like that, and you could think of places where something like this could come in handy even with legitimate use of others' pics, e.g. a gallery that hosts free graphical resources like icons. You could accomplish this by creating a hash for each file that gets uploaded and storing this checksum in the database. On upload, the checksum of the file that is about to be uploaded could be compared against the hashes of pics that already exist in the gallery. However, I can only see a very small number of users who might find this feature helpful, so it's rather a mod/plugin candidate than a feature that will go into the core code.
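A minimal sketch of the hash idea (md5_file() is a standard PHP function; the file_hashes table is made up, not actual Coppermine code):

Code: [Select]
<?php
// Hypothetical table: file_hashes (file_id INT, hash CHAR(32))

// On upload: compute the file's checksum and compare it against the
// checksums of pics that already exist in the gallery
function is_duplicate(PDO $db, $uploaded_file)
{
    $hash = md5_file($uploaded_file); // checksum of the file contents

    $stmt = $db->prepare('SELECT file_id FROM file_hashes WHERE hash = ?');
    $stmt->execute(array($hash));
    if ($stmt->fetchColumn() !== false) {
        return true; // a byte-identical file already exists
    }

    // New file: store its checksum for future uploads to compare against
    $stmt = $db->prepare('INSERT INTO file_hashes (hash) VALUES (?)');
    $stmt->execute(array($hash));
    return false;
}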

faptastic

  • Coppermine novice
  • Posts: 21
Re: duplicate prevention/view in a manner to prevent duplicates
« Reply #3 on: October 12, 2005, 09:02:23 am »

Quote from: Joachim Müller on October 12, 2005, 07:53:15 am

Yeah, I understand. Well, I can crack out my PHP book and see if I can manage something. x.x

Stramm

  • Dev Team member
  • Coppermine addict
  • Posts: 6006
    • Bettis Wollwelt
Re: duplicate prevention/view in a manner to prevent duplicates
« Reply #4 on: October 12, 2005, 01:29:13 pm »

you could create a checksum... but if the pic got recompressed (and the funny pics are so widespread on the web that there are literally countless versions of the same image), it ends up with a different checksum and the comparison won't work
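easy to see with a quick GD test (assuming any funny.jpg at hand; re-saving the visually identical pic yields a different checksum):

Code: [Select]
<?php
// Re-save the same picture at a different JPEG quality...
$img = imagecreatefromjpeg('funny.jpg');
imagejpeg($img, 'funny_recompressed.jpg', 85);
imagedestroy($img);

// ...and the checksums no longer match, although the image looks the same
echo md5_file('funny.jpg'), "\n";
echo md5_file('funny_recompressed.jpg'), "\n";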

faptastic

  • Coppermine novice
  • Posts: 21
Re: duplicate prevention/view in a manner to prevent duplicates
« Reply #5 on: October 12, 2005, 05:30:30 pm »

Quote from: Stramm on October 12, 2005, 01:29:13 pm

Yeah, I know, but the idea is that it'd weed out at least some of them.
And rather than do a hash, I was thinking of matching on height-to-width ratio, file type, and size, as well as EXIF or other metadata.

A hash would be easier and quicker, but it wouldn't be as effective, I feel.
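Roughly what I'm picturing, as a hypothetical sketch (getimagesize() and exif_read_data() are standard PHP; the tolerance values are pure guesses):

Code: [Select]
<?php
// Build a cheap metadata "fingerprint" instead of hashing the file contents
function metadata_fingerprint($path)
{
    list($width, $height, $type) = getimagesize($path);
    $exif = ($type == IMAGETYPE_JPEG) ? @exif_read_data($path) : false;
    return array(
        'ratio'  => $width / $height,  // aspect ratio
        'type'   => $type,             // IMAGETYPE_* constant
        'size'   => filesize($path),   // bytes on disk
        'camera' => ($exif && isset($exif['Model'])) ? $exif['Model'] : null,
    );
}

// Two files are "possibly the same" if their cheap metadata is close enough;
// matches would then be shown to the user as thumbnails to confirm or discard
function possibly_same($a, $b)
{
    return $a['type'] == $b['type']
        && abs($a['ratio'] - $b['ratio']) < 0.01            // near-identical aspect
        && abs($a['size'] - $b['size']) < 0.1 * $a['size']  // sizes within ~10%
        && ($a['camera'] === null || $b['camera'] === null
            || $a['camera'] === $b['camera']);              // EXIF camera model, if any
}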