Can anyone advise on the character limits for ENS subdomains? I am looking for the minimum and maximum character limits (ENS .eth names, for example, have to be at least 3 characters).
Am I able to have 1.domain.eth where I own domain.eth for example?
I've searched and am unable to find an answer. Thanks.
Currently, for subnames the minimum is one character and there is no maximum. A subname is handled by hashing the label and then combining that hash with its parent name's hash.
For the upcoming NameWrapper contract, subname labels will have a maximum of 255 bytes.
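The label-plus-parent hashing described above can be sketched as follows. Note this only illustrates the recursive structure: real ENS uses keccak-256, which is not in the Python standard library, so `hashlib.sha256` stands in here and the digests will not match on-chain namehashes.

```python
import hashlib

def namehash(name: str) -> bytes:
    """ENS-style namehash recursion (sha256 standing in for keccak-256)."""
    node = b"\x00" * 32  # the empty name hashes to 32 zero bytes
    if name:
        # Process labels right to left: "1.domain.eth" -> eth, domain, 1
        for label in reversed(name.split(".")):
            label_hash = hashlib.sha256(label.encode()).digest()
            node = hashlib.sha256(node + label_hash).digest()
    return node

# A one-character label like "1" is a valid subname label:
parent = namehash("domain.eth")
child = namehash("1.domain.eth")
assert child != parent and len(child) == 32
```

Because each label is hashed before being combined, the label's length never appears in the final node, which is why no maximum is enforced at this level.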
Related
I want to know if BigInt is enough in size.
I have created a registration.php where the user is emailed an account activation link to click to verify their email address so their account gets activated.
Account Activation Link is in this format:
[php]
$account_activation_link =
    "http://www." . $site_domain . "/" . $social_network_name .
    "/activate_account.php?primary_website_email=" . $primary_website_email .
    "&account_activation_code=" . $account_activation_code;
[/php]
Account Activation Code is in this format:
$account_activation_code = sha1((string) mt_rand(5, 30)); // Cast the int to a string, as sha1() expects a string parameter.
Now, the following link got emailed:
http://www.myssite.com/folder/activate_account.php?primary_website_email=my.email@gmail.com&account_activation_code=22d200f8670dbdb3e253a90eee5098477c95c23d
Note the account activation code that got generated by sha1:
22d200f8670dbdb3e253a90eee5098477c95c23d
But in my mysql db, in the "account_activation_code" column, I only see:
"22". The rest of the activation code is missing. Why is that ?
The column is set to BIGINT. Is that not enough to hold the SHA-1 generated code?
What is your suggestion ?
Thank You
Hashing functions like SHA-1 produce fixed-size binary values: 160 bits for SHA-1, 256 bits for the common SHA-256. No cryptographic hash will fit in a 64-bit BIGINT column; a 64-bit hash would be uselessly small and produce nothing but collisions.
Normally people store hashes as their hex-encoded equivalents in a VARCHAR column (CHAR(40) is exactly enough for SHA-1). These can be indexed and perform well in most situations, especially one like yours where lookups happen only occasionally, triggered by clicks. From a performance and storage perspective there are no problems here.
Short answer: BIGINT is way too small.
A hash is basically a stream of bits (160 of them in the case of SHA-1). While it's certainly possible to interpret those bits as a base-2 number and convert it to base 10, you would need very large integer storage to do so (integer variables wider than 64 bits are uncommon) and there is no obvious advantage. BIGINT is a 64-bit type, and thus cannot do the job.
Unless you have a good reason to store it as number, I'd simply go for either a binary column type or its plain-text hexadecimal representation in a good old VARCHAR (the latter tends to be more practical to handle).
You are trying to store a string in a BIGINT; that is your issue. SHA hashes are hexadecimal strings (digits plus the letters a-f), not pure numbers. Change the field to a VARCHAR and you'll be fine.
I have had to work on a project where we have an identifier in hex.
For example, B900001752F10001 is received by a parser developed in Java into a signed long variable. We store that variable in a signed BIGINT column in a MySQL DB.
Every time we need the hex string we use the HEX(code) function and get what is expected.
But when we have to provision the master table we need to insert valid codes directly; to achieve that we used something like:
Update employee set code=0xB900001752F10001 where main_employee_id=1002;
It worked in the past, producing a code stored in the DB as
13330654997192441857
but now we are using the exact same instruction and the code is stored in the DB as
-5116089076517109759
Comparing those two numbers using the HEX function, both produce the same hex number:
select HEX(-5116089076517109759), HEX(13330654997192441857)
B900001752F10001, B900001752F10001
Could someone please provide ideas on why this is happening? From the provisioning perspective, we need to ensure the code is stored as 13330654997192441857 so that the codes match when an authentication event happens.
I have run out of other ideas; I appreciate any help.
I think you have overflowed the datatype.
According to MySQL manual, signed bigint is in the range of
-9,223,372,036,854,775,808
to
9,223,372,036,854,775,807
Your number
13,330,654,997,192,441,857
has exceeded the positive bound above, so it overflows and is
interpreted as a negative number.
Having said that, I think you may still be okay with your command: it is only when you interpret the stored 64-bit pattern as a signed number that the result looks confusing.
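The two's-complement arithmetic can be reproduced outside MySQL; this sketch checks the numbers from the question:

```python
# The hex literal fits in 64 bits, but its unsigned value exceeds the
# signed BIGINT maximum, so the same bit pattern reads as a negative
# number under two's complement.
unsigned = int("B900001752F10001", 16)
assert unsigned == 13330654997192441857
assert unsigned > 2**63 - 1           # above the signed BIGINT maximum

signed = unsigned - 2**64             # same 64-bit pattern, read as signed
assert signed == -5116089076517109759

# Both values reduce to the same 64-bit pattern, which is why HEX() agrees:
assert format(signed % 2**64, "X") == "B900001752F10001"
```

So the bits stored are identical either way; only the signed decimal rendering differs. If you need the unsigned rendering, declare the column BIGINT UNSIGNED.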
Update employee set code=0xB900001752F10001 where main_employee_id=1002;
On a 64-bit machine the top bit is the sign bit, so the largest signed value is 9,223,372,036,854,775,807. If your number exceeds that, the top bit becomes 1 and the value is interpreted as negative: it has overflowed.
That is why your 13330654997192441857 becomes -5116089076517109759.
What is the best datatype in MySQL for a title number of the form X.X.X (e.g. 4.3.1)?
Thanks.
varchar(10) should be sufficient. Your "4.3.1" is 5 characters, so 10 characters would allow an additional 5, e.g. "14.22.1740".
The length of the storage depends on how large your version numbers might grow (then add a few characters to account for potential future needs). Google Chrome's current version (or at least, the one I'm running) is "47.0.2526.106 m": 15 characters.
Comparing these starts to get dicey, which appears to be your main problem.
But comparing version numbers has always been dicey.
At that point, I would suggest using three integer fields, "main", "major", and "minor", and sorting on all three:
ORDER BY main DESC, major DESC, minor DESC
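A minimal sketch of that three-column layout (SQLite standing in for MySQL):

```python
import sqlite3

# Store main/major/minor separately so version ordering is numeric,
# not lexicographic.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE versions (main INTEGER, major INTEGER, minor INTEGER)")
conn.executemany("INSERT INTO versions VALUES (?, ?, ?)",
                 [(4, 3, 1), (14, 22, 1740), (4, 10, 0)])
rows = conn.execute(
    "SELECT main, major, minor FROM versions "
    "ORDER BY main DESC, major DESC, minor DESC").fetchall()
assert rows == [(14, 22, 1740), (4, 10, 0), (4, 3, 1)]
```

A plain varchar sort would get this wrong: as strings, "4.10.0" sorts before "4.3.1" (since "1" < "3") and "14.22.1740" sorts before both (since "1" < "4").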
I'm working on a project that needs to store something like
101110101010100011010101001
into the database. It's not a file or archive: it's only a bit array, and I think that storing it in a varchar column is a waste of space/performance.
I've looked at the BLOB and VARBINARY types, but both of them allow inserting a value like 54563423523515453453, which is not exactly a bit array.
For sure, if I store a bit array like 10001000 as text in a BLOB/VARBINARY/VARCHAR column, it will consume more than one byte, and I want the minimum space consumed: eight bits should take only one byte, 16 bits two bytes, and so on.
If it's not possible, then what is the best approach to waste the minimum amount of space in this case?
Important note: the size of the array is variable, and is not divisible by eight in every situation. Sometimes I will need to store 325 bits, other times 7143 bits...
In one of my previous projects I converted streams of 1's and 0's to decimal, but they were shorter; I don't know if that would be applicable in your project.
On the other hand, IMHO, you should clarify what you will need to do with that data once it is stored. Search? Compare? It might largely depend on the purpose of the database.
Could you gzip it and then store it? Is that applicable?
Binary is a string representation of a number. The string
101110101010100011010101001
represents the number
... + 1*2^5 + 0*2^4 + 1*2^3 + 0*2^2 + 0*2^1 + 1*2^0
As such, it could be stored in a 32-bit integer if it were converted from a binary string to the number it represents. In Perl, one would use
oct('0b'.$binary)
But you have a variable number of bits. Not a problem! Just process them 8 at a time to create a string of bytes to place in a BLOB or similar.
Ah, but there's a catch. You'll need to add padding to get a bit count divisible by 8, which means you'll also need a way to remove that padding later. A simple approach, if there's a known maximum length, is to use a length prefix. E.g. if you know the number of bits will never exceed 65,535, encode the number of bits in the first two bytes of the string.
pack('nB*', length($binary), $binary)
which is reverted using
my ($length, $binary) = unpack('nB*', $packed);
substr($binary, $length) = '';
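For readers not using Perl, here is the same length-prefix scheme sketched in Python (the names pack_bits/unpack_bits are my own):

```python
import struct

def pack_bits(bits: str) -> bytes:
    """Two bytes of bit count (up to 65,535 bits), then bits packed 8 per byte."""
    padded = bits + "0" * (-len(bits) % 8)  # pad to a byte boundary
    body = int(padded, 2).to_bytes(len(padded) // 8, "big") if padded else b""
    return struct.pack(">H", len(bits)) + body

def unpack_bits(blob: bytes) -> str:
    """Reverse of pack_bits: read the length prefix, then strip the padding."""
    (length,) = struct.unpack(">H", blob[:2])
    body = blob[2:]
    padded = bin(int.from_bytes(body, "big"))[2:].zfill(len(body) * 8) if body else ""
    return padded[:length]

bits = "101110101010100011010101001"   # 27 bits -> 2 prefix + 4 body bytes
assert len(pack_bits(bits)) == 6
assert unpack_bits(pack_bits(bits)) == bits
```

So 325 bits cost 2 + 41 bytes and 7143 bits cost 2 + 893 bytes, close to the one-byte-per-eight-bits minimum the question asks for.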
I created a table in MS Access 2013 with only one column, of the "Long Text" type (called Memo earlier), and made it the primary key of the table. I stored a long string of 255+ characters and then tried to store another string whose first 255 characters were the same as the previously stored string but whose characters after the first 255 were all different. MS Access gave a "duplicate data" error.

In the new string I changed the characters after the 255th position, using different combinations of characters, and all gave the error. But when I change any character before the 255th position, it does not give an error. So I concluded that MS Access checks only the first 255 characters of a "Long Text" value when checking for duplicates in that column. Is that so? What else could be the reason?
String Stored of 256 characters:
LoremIpsumissimplydummytextoftheprintingandtypesettingindustryLoremIpsumhasbeentheindustrysstandarddummytexteversincethe1500swhenanunknownprintertookagalleyoftypeandscrambledittomakeatypespecimenbookIthassurvivednotonlyfivecenturiesbutalsotheleapintoelectr
String Gave Error:
LoremIpsumissimplydummytextoftheprintingandtypesettingindustryLoremIpsumhasbeentheindustrysstandarddummytexteversincethe1500swhenanunknownprintertookagalleyoftypeandscrambledittomakeatypespecimenbookIthassurvivednotonlyfivecenturiesbutalsotheleapintoelect1
String Gave Error:
LoremIpsumissimplydummytextoftheprintingandtypesettingindustryLoremIpsumhasbeentheindustrysstandarddummytexteversincethe1500swhenanunknownprintertookagalleyoftypeandscrambledittomakeatypespecimenbookIthassurvivednotonlyfivecenturiesbutalsotheleapintoelect2
String Gave Error:
LoremIpsumissimplydummytextoftheprintingandtypesettingindustryLoremIpsumhasbeentheindustrysstandarddummytexteversincethe1500swhenanunknownprintertookagalleyoftypeandscrambledittomakeatypespecimenbookIthassurvivednotonlyfivecenturiesbutalsotheleapintoelect123
Does Not Give Error:
LoremIpsumissimplydummytextoftheprintingandtypesettingindustryLoremIpsumhasbeentheindustrysstandarddummytexteversincethe1500swhenanunknownprintertookagalleyoftypeandscrambledittomakeatypespecimenbookIthassurvivednotonlyfivecenturiesbutalsotheleapintoelec1
Does Not Give Error:
LoremIpsumissimplydummytextoftheprintingandtypesettingindustryLoremIpsumhasbeentheindustrysstandarddummytexteversincethe1500swhenanunknownprintertookagalleyoftypeandscrambledittomakeatypespecimenbookIthassurvivednotonlyfivecenturiesbutalsotheleapintoelec2
Does Not Give Error:
LoremIpsumissimplydummytextoftheprintingandtypesettingindustryLoremIpsumhasbeentheindustrysstandarddummytexteversincethe1500swhenanunknownprintertookagalleyoftypeandscrambledittomakeatypespecimenbookIthassurvivednotonlyfivecenturiesbutalsotheleapintoelec3
Please notice the difference in the last few characters of the above samples. The first stored string has 256 characters. Even if the column is not the primary key, the problem remains the same if "Indexed: Yes (No Duplicates)" is set for that column in the table design.
As @HansUp stated in the comments, Access (specifically the Jet/ACE db engine) only uses the first 255 characters of a Memo/Long Text field to create its index. Hence, it only uses the first 255 characters to enforce No Duplicates.
@HansUp's advice to use a different db engine that provides better support for long strings and full-text search is probably the best approach, but I understand there are often other considerations that may be limiting you to solving your problem in Access.
As such, here is an Access-only approach to solving your problem. This assumes the requirement you listed in the comments is valid; i.e., you need to store unique strings of between 400 and 1000 characters.
Alternative 1
Keep your initial Memo/Long Text field: Notes
Create four text fields (not Memo/Long Text) of 250 characters max: Notes1, Notes2, Notes3, Notes4
Set all four text fields to Required -> True and Allow Zero Length -> True (this is required to ensure the unique index is enforced for strings shorter than 751 characters)
Create a unique index and add all four text fields to that index
Don't ignore nulls in your index
When you store the values, you will need to store them in the Notes field and also split the string among the four smaller NotesX fields
Alternative 2:
Keep your current setup and enforce the uniqueness at code level. Every time you update or insert a note, do a search on all notes that match the first 255 characters, read the value and perform the comparison in code.
Alternative 3 (thanks to @HansUp for suggesting this in the comments):
Keep your initial Memo/Long Text field: Notes
Create a text field to store a hash of the long text, e.g. 64 characters for the hex encoding of a 256-bit hash: NotesHash
Add a unique index to your NotesHash field
Every time the memo field is changed, re-compute the hash value and attempt to store it in the table
Notes for this method:
As the pigeonhole principle easily proves, there is the possibility that two different strings will generate the same hash (a collision). However, using a good hashing algorithm will make the actual probability approach zero.
This site offers some VB6/VBA/VBScript implementations of various hashing algorithms. I can't vouch for their correctness, but they passed the eye test for me. Use at your own risk, but it's at least a good starting point.
Really, you can use any deterministic function that returns a string of 255 characters or fewer given an arbitrarily large input. The difference between a crappy hash algorithm and a good one is how well it minimizes collisions. For that reason, I would suggest you use one based on a popular standard.
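A sketch of Alternative 3 in SQL terms (SQLite and Python's SHA-256 standing in for Access and a VBA hash implementation):

```python
import hashlib
import sqlite3

# Keep the full text in one column and enforce uniqueness on a hash of
# it, so the unique index only ever sees a short fixed-length value.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (Notes TEXT, NotesHash TEXT UNIQUE)")

def add_note(note: str) -> bool:
    digest = hashlib.sha256(note.encode()).hexdigest()  # 64 hex characters
    try:
        conn.execute("INSERT INTO t VALUES (?, ?)", (note, digest))
        return True
    except sqlite3.IntegrityError:
        return False  # duplicate note, rejected by the unique hash index

long_note = "x" * 255 + "tail-A"
assert add_note(long_note) is True
assert add_note(long_note) is False            # exact duplicate rejected
assert add_note("x" * 255 + "tail-B") is True  # differs only after char 255
```

The last assertion is exactly the case that defeats the built-in index: two notes identical in their first 255 characters are still distinguished, because the hash covers the whole string.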
And yes, I still highly recommend @HansUp's solution to simply use a different db engine.