Conversation

@Hananel-Hazan
Collaborator

Noted by @Devdher:
Clamping makes the random values cluster at the wmax and wmin values.
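The bias being described can be seen in a short sketch (plain-Python illustration with made-up bounds, not BindsNET code): clamping values drawn from outside [wmin, wmax] piles probability mass exactly onto the two bounds.

```python
import random

# Hypothetical illustration: wmin/wmax chosen arbitrarily.
random.seed(0)
wmin, wmax = 0.0, 0.3
samples = [random.uniform(-1.0, 1.0) for _ in range(10_000)]

# Clamping: everything below wmin maps exactly to wmin,
# everything above wmax maps exactly to wmax.
clamped = [min(max(s, wmin), wmax) for s in samples]

at_bounds = sum(c in (wmin, wmax) for c in clamped)
print(f"{at_bounds / len(clamped):.0%} of clamped values sit exactly on a bound")
```

With these bounds, well over half of the samples land exactly on wmin or wmax, which is the "more populated by wmax and wmin" effect noted above.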

@Hananel-Hazan requested a review from djsaunde, May 25, 2018 22:19
Collaborator

@djsaunde left a comment

Good change!

wmax=wmax,
norm=norm,
decay=0.8),
decay=0),
Collaborator

Shouldn’t decay be None by default?

Collaborator Author

It should, but not in the Diehl and Cook implementation.
I'm still searching for a good one...


self.w = kwargs.get('w', torch.rand(*source.shape, *target.shape))
self.w = torch.clamp(self.w, self.wmin, self.wmax)
#self.w = torch.clamp(self.w, self.wmin, self.wmax)
Collaborator

Remove this commented-out line.


self.w = kwargs.get('w', torch.rand(*source.shape, *target.shape))
self.w = torch.clamp(self.w, self.wmin, self.wmax)
self.w = self.wmin + self.w*(self.wmax-self.wmin)
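The affine rescale in the line above maps torch.rand's uniform [0, 1) samples onto [wmin, wmax) without stacking values on the endpoints. A plain-Python sketch of the same arithmetic (illustrative bounds, not the PR's code):

```python
import random

random.seed(1)
wmin, wmax = -0.2, 0.5                         # illustrative bounds
w = [random.random() for _ in range(10_000)]   # stand-in for torch.rand

# Affine map: [0, 1) -> [wmin, wmax); the distribution stays uniform,
# so no probability mass piles up on the bounds.
scaled = [wmin + x * (wmax - wmin) for x in w]
```

This is why rescaling suits random initialization, whereas clamping is only appropriate as a safety net for user-supplied weights.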
Collaborator

Actually, this rescaling only makes sense if the weights are randomly initialized. It should only happen if the weights aren't passed in; otherwise, the clamping should still take place.

Collaborator Author

If the user passes their own weights and they need to be clamped, that's a problem: it will bias the weight matrix, and the user needs to be notified...

print("Warning: provided weights matrix contains values smaller than:", self.wmin)
bo = True
if bo:
    print("Warning: the weights matrix has been clamped between:", self.wmin, "and", self.wmax,
          "\nThe matrix values can be biased toward the max and min values!")
Collaborator

If you're going to use warnings, use the warnings module.
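A sketch of what that comment suggests, using the standard warnings module instead of print (hypothetical helper; check_bounds and its signature are made up for illustration and are not the code that was merged):

```python
import warnings

def check_bounds(w, wmin, wmax):
    """Clamp user-supplied weights to [wmin, wmax], warning if any fall outside."""
    if min(w) < wmin or max(w) > wmax:
        # warnings.warn (unlike print) can be filtered, logged,
        # or escalated to an error by the caller.
        warnings.warn(
            f"Provided weight matrix contains values outside [{wmin}, {wmax}]; "
            "clamping will bias those entries toward the bounds.",
            stacklevel=2,
        )
    return [min(max(x, wmin), wmax) for x in w]
```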

torch.rand(*source.shape, *target.shape)
self.w = self.wmin + self.w*(self.wmax-self.wmin)
else:
bo = False
Collaborator

This is a mess; I'm okay with the logic, but I'm going to refactor this once it's merged.

@djsaunde
Collaborator

@Hananel-Hazan there's a merge conflict in bindsnet/network/topology.py. Also, this pull request fails a test in test_monitors.py; could you check these out?

@Hananel-Hazan merged commit 5cd197a into master, May 29, 2018