Trying to get a 4x4 keypad working with a PIC 18f4685.
I've turned on the weak pull-ups and set the appropriate pins to input or output, but when I send a signal out I'm not getting it back on bits 6 and 7. It just reads zeros...
I've tried to debug using the PICkit 3, but it seems that it uses RB7 and crashes things when a button in that row is pushed. Of course, that tells me the signals must be getting through, up to a point.
Is there anything else in particular that I need to set up in configuration for PORT B?
As always...your help is greatly appreciated.
Since the EE site so rudely shut you down before you could get an answer, I figured I would come here to answer your question.
Check Table 10-3 on page 135; it lists all the capabilities of the PORTB pins. Note that RB6 and RB7 are also the debugging pins, so I wouldn't use those.
Also, are you writing to LATx and reading from PORTx? It's important to do this when reading and writing to the same port. If you read and write to PORTx, you can accidentally read a stale value from an output that has not had enough time to change yet, and your next write will obliterate your intended value. This is particularly pernicious on PICs that don't have a LATx register; any operation, even bit-wise operations like BSF/BCF, will do a read-modify-write of the ENTIRE port register, affecting more than the bit that you intended to modify. See the answer to this EE question: https://electronics.stackexchange.com/questions/28744/interfacing-a-keypad-with-a-microcontroller
I'm not sure of your exact schematic (a sketch might help), but a common error with PIC GPIO is not setting the ADC registers for digital input. The pins come out of reset configured as analog inputs.
Look at register description 19-2 in the PIC18F4685 Datasheet.
ADCON1 comes out of reset as 0x00. To set all the analog-capable pins to digital I/O, PCFG3:PCFG0 all need to be set to 1:
ADCON1bits.PCFG = 0x0F;
Can you show us your code for setting the tri-state register (TRISB) and how you are reading? Have you checked the voltages at the input pins with a digital multimeter (DMM) before and during the button press? They're $10 and worth it.
Finally, did you disable the analog pins? On PIC24 chips you have to do:
AD1PCFG = 0xFFFF;
before digital input reads will work. Might be the same on your chip.
Can you give us the EXACT model number of your chip?
We know that a computer performs all its operations in binary only. It cannot work with decimal or any other base.
If a computer cannot perform any operation in decimal, then how does it convert decimal numbers to binary? I think there are different stages during the conversion at which addition and multiplication are required. How can the computer add or multiply any number even before getting its binary equivalent?
I have searched for this in many places but couldn't find a convincing answer.
Note: this Stack Exchange site is not the right place to ask this question. I am still answering it, but you should move it to the appropriate site, or delete it after getting your answer.
Well, the computer doesn't care what input you supply to it. Think of it like your TV switch: when you switch it on, your TV starts to work, because it got the current flow it required. Similarly, in a computer there is a threshold voltage, let's say 5 V. Anything below the threshold is considered a '0', otherwise a '1'. You may have seen AND, OR, etc. gates: if you supply two '1's to an AND gate, it outputs '1', otherwise '0'. There are many such digital circuits; some examples are binary adders, latches, and flip-flops. These work with those voltage signals (characterized as 0 or 1 as explained above). A computer is a combination of millions of such circuits.
When you talk about converting decimal to binary, it's actually not like that. Every program (spreadsheets, games, etc.) is written in some language, and most languages are either compiled or interpreted. Some compiled languages are C, Java, etc., and some interpreted ones are Python, Ruby, etc. The job of a compiler or interpreter is to convert the code you wrote into assembly code according to the rules of that language. Assembly code is then converted into machine code when it has to run. Machine code is pure zeros and ones, and those zeros and ones define what to execute and when.
Don't confuse this with what you see. The desktop that displays the data to you is a secondary thing, made specifically to make things easy for us.
In a computer, a clock keeps running. You must have heard of a 2.5 GHz processor or something like that: that is the frequency at which instructions are executed. It seems odd, but yes, whether you are doing work or not, while the computer is on it executes instructions continuously, and if you are not doing anything it keeps checking for interaction.
Picture it like this:
1) You turned on your PC; the hardware got ready for your commands and kept checking for interaction.
2) You opened a folder. To open a folder you obviously need to touch the keyboard or mouse, or do some voice interaction. Your computer follows that interaction: pressing the down arrow produces a zero-or-one signal at the right place, and only after that does the result get displayed to you. It is not that what is being displayed is being done; rather, what is being done is displayed for you to follow easily.
I do not have experience with microcontrollers, but I have something related to them. Here is an explanation of my issue:
I have an algorithm, and I want to calculate how many cycles it would cost on a specific AVR microcontroller.
To do that I downloaded AVR Studio 6 and used the simulator. I succeeded in obtaining the number of cycles for my algorithm. What I want to know is how I can make sure that my algorithm is working as it should. AVR Studio allows me to debug using the simulator, but I am not able to see the output of my algorithm.
To simplify my question: I would like some help implementing the hello-world example in AVR Studio, that is, I want to see "hello world" in the output window, if that is possible.
My question is not how to program the microcontroller; my question is how I can see the output of a program in AVR Studio.
Many thanks
As Hanno Binder suggested in his comment:
Atmel Studio still does not provide any means to display debug messages sent by the simulated program. Your only option is to place breakpoints at appropriate locations and then inspect the state of the device in the simulator: for example the locations in RAM where your result is stored, or the registers in which it may reside; maybe set a 'watch' on a variable or expression.
I think this is the best answer: watch variables and memory while in debug mode.
Note: turn off optimization when you want to debug for information, or some variables will be optimized away.
The best way to test whether an algorithm works is to run it in a regular PC program, feed it data, and compare the results with ground truth.
Clearly, to be able to do this, a good programming style is necessary, one that separates hardware-related tasks from the actual data processing. Additionally, you have to keep architectural differences in mind (e.g. int = 16 bit vs. int = 32 bit; use inttypes.h).
I'm entirely new to LabVIEW, and as a pet project I'm trying to recreate a pulse detector. Thing is, the version of the .VI is LabVIEW 2010, and I can't open the .VI in LabVIEW 2009, so we're trying to remake it by looking at the module. I do, however, have the image, but since I'm pretty new I can't identify some of the components used. Below is an image of the .VI, with the parts I don't know circled in red and numbered. What exactly are these? Thanks!
To make a shift register, right-click on the edge of the while loop and place a shift register. The Wait (ms) node is found in the timing functions palette. #1 and #3 are found in the waveform generation palette. And #2 is a waveform graph that is bound to the output of the filter; just right-click on the output of the filter and create an indicator.
I only have limited experience with the specific features in this code, so I don't have exact names, but it should point you in the right direction:
2 is a dynamic data indicator.
1 converts it to a waveform (it probably appears automatically if you hook up a DDT wire to a WF function).
3 unbundles the data from the waveform. It should be in the waveform palette.
4 is a shift register.
5 is a wait function.
In general, I would recommend that you get learning, as you will need to understand these things at least to that level before you can be proficient.
Also, the NI forums are much more suitable for this type of question, and they have many more users. I would suggest that if you have such questions which you can't answer yourself, you post them there.
OK, so I've recently started doing some reverse engineering, and I keep coming across a term I have no idea about: a "badboy"?
00013F92 7E 24 JLE SHORT function.00013FB8 ; badboy
Could anyone explain?
Maybe this is the answer:
http://www.codeproject.com/Articles/30815/An-Anti-Reverse-Engineering-Guide
Search for "bad boy".
Let me paste that in, four and a half years after the fact, to satisfy the moderator:
There are three types of breakpoints available to a reverse engineer:
hardware, memory, and INT 3h breakpoints. Breakpoints are essential to
a reverse engineer, and without them, live analysis of a module does
him or her little good. Breakpoints allow for the stopping of
execution of a program at any point where one is placed. By utilizing
this, reverse engineers can put breakpoints in areas like Windows
APIs, and can very easily find where a badboy message (a messagebox
saying you entered a bad serial, for example) is coming from. In fact,
this is probably the most utilized technique in cracking, the only
competition would be a referenced text string search. This is why
breakpoint checks are done over important APIs like MessageBox,
VirtualAlloc, CreateDialog, and others that play an important role in
the protecting user information process. The first example will cover
the most common type of breakpoint which utilizes the INT 3h
instruction.
I am practicing reversing skills using OllyDbg under Windows.
There is an interactive window asking for your input, let's say a "serial number". My question is that when the user operates on the window, it is hard to locate the related data flow within the debugger window. For example, if I press F9, we can view the instruction flow; but when typing into the window, I can't tell which instructions have been executed.
My target is to find some jump instruction and change it, so that I can bypass the correct-input requirement. I think the instruction should be quite close to the instructions handling the arguments, and related to a TEST instruction.
Looking for hints or tricks. Thanks.
One thing you could do is type something in the text field and then use an application such as Cheat Engine to find out where in memory these characters are stored. Then you can put a memory (on access) breakpoint on the address of the first character in OllyDbg. Then press the button that verifies the serial. When an instruction accesses this part of memory it will break, and you're inside the part of the code that verifies your string. From here you have to work out what the code is doing to find the instruction you want to alter.
Depending on how secure the application is, this will work; with a more secure application it most likely won't. When you're just starting reverse engineering, I suggest you find some easy applications made for cracking and work your way up to the more secure ones. A site where you can find many of these "crackmes" is crackmes.de. I can also suggest lena151's tutorials here; some of the best tutorials I've seen on reverse engineering.