Fixing bug with random weights between wmax and wmin #61
Conversation
djsaunde
left a comment
Good change!
bindsnet/models/__init__.py
Outdated
      wmax=wmax,
      norm=norm,
-     decay=0.8),
+     decay=0),
Shouldn’t decay be None by default?
It should, but not in the Diehl and Cook implementation.
I'm still searching for the right value...
bindsnet/network/topology.py
Outdated
  self.w = kwargs.get('w', torch.rand(*source.shape, *target.shape))
- self.w = torch.clamp(self.w, self.wmin, self.wmax)
+ #self.w = torch.clamp(self.w, self.wmin, self.wmax)
Remove this commented out line.
bindsnet/network/topology.py
Outdated
  self.w = kwargs.get('w', torch.rand(*source.shape, *target.shape))
- self.w = torch.clamp(self.w, self.wmin, self.wmax)
+ self.w = self.wmin + self.w*(self.wmax-self.wmin)
Actually, this only makes sense if the weights are randomly initialized. The rescaling should only happen when weights aren't passed in; otherwise, the clamping should still take place.
If the user passes their own weights and they need to be clamped, that's a problem. Clamping will bias the weight matrix, and the user needs to be notified...
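For context, here is a minimal standalone sketch of the behaviour being discussed: rescale when the weights are randomly drawn, clamp (with a notification) when the user supplies them. The function name and signature are hypothetical, not BindsNET's API.

import warnings
import torch

def init_connection_weights(source_shape, target_shape, wmin, wmax, w=None):
    # Hypothetical helper mirroring the constructor logic under discussion.
    if w is None:
        # No weights supplied: draw uniform values in [0, 1) and rescale them
        # into [wmin, wmax) so the distribution stays uniform.
        w = torch.rand(*source_shape, *target_shape)
        w = wmin + w * (wmax - wmin)
    else:
        # User-supplied weights: warn before clamping, since clamping piles
        # out-of-range values onto the bounds.
        if (w < wmin).any() or (w > wmax).any():
            warnings.warn('Provided weights lie outside [wmin, wmax]; '
                          'clamping will bias them toward the bounds.')
        w = torch.clamp(w, wmin, wmax)
    return w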
bindsnet/network/topology.py
Outdated
| print("Warning: provided weights matrix contain values smaller then :", self.wmin) | ||
| bo = True | ||
| if bo: | ||
| print("Warning: the weights matrix as been clamp between: ", self.wmin," to ", self.wmin," \n The matrix values can be bais to max and min values!!!") |
If you're going to use warnings, use the warnings module.
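For reference, a minimal sketch of what that could look like with Python's standard warnings module; the helper name and message text are illustrative only.

import warnings
import torch

def warn_if_out_of_range(w, wmin, wmax):
    # Hypothetical helper: emit a standard warning instead of print() when
    # the provided weight matrix falls outside [wmin, wmax].
    if (w < wmin).any() or (w > wmax).any():
        warnings.warn('Provided weight matrix contains values outside '
                      '[{}, {}]; clamping will bias them toward the '
                      'bounds.'.format(wmin, wmax))

# Example: a matrix drawn from [-1, 1) checked against [0, 1] triggers the warning.
warn_if_out_of_range(torch.rand(3, 3) * 2 - 1, 0.0, 1.0)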
torch.rand(*source.shape, *target.shape)
self.w = self.wmin + self.w*(self.wmax-self.wmin)
else:
    bo = False
This is a mess; I'm okay with the logic, but I'm going to refactor this once it's merged.
@Hananel-Hazan there's a merge conflict in one of the changed files.
Noted by @Devdher:
Clamping makes the random values pile up at the wmax and wmin values.
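A small self-contained example of that effect; the bounds 0.2 and 0.8 are arbitrary, chosen only for illustration.

import torch

torch.manual_seed(0)
wmin, wmax = 0.2, 0.8                        # illustrative bounds only
w = torch.rand(100000)                       # uniform in [0, 1), as in the constructor

clamped = torch.clamp(w, wmin, wmax)         # out-of-range mass collapses onto the bounds
rescaled = wmin + w * (wmax - wmin)          # stays uniform over [wmin, wmax)

print((clamped == wmin).float().mean().item())   # ~0.2: a fifth of the values sit exactly at wmin
print((clamped == wmax).float().mean().item())   # ~0.2: another fifth sit exactly at wmax
print((rescaled == wmin).float().mean().item())  # ~0.0: no pile-up after rescaling
print((rescaled == wmax).float().mean().item())  # ~0.0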