minsoc: gdb set does not work
by flozn on Sep 6, 2013
flozn
Posts: 16 Joined: Jun 10, 2013 Last seen: Sep 22, 2020
Hello everyone :) !
The last days I ported minsoc to my Digilent Atlys board with Spartan6-lx45. While trying to get the UART example working, I recognized my gdb has some strange behaviour. Because there was no UART output (wrong adapter at minicom ;)... the right one is ttyACU instead of ttyUSB), I checked the content of the baud divisor variable: "print divisor" ... prints always zero! "print locals" ... prints always "divisor = 0" Even after the line which assigns the value (IN_CLK/(16*baudrate)). My idea was to change the divisor value manually with "set divisor=11". But "print locals" didn't show any change :/ . The funny thing is, the whole system works! That means after setting UART client program correctly the "Hello World" appeared on my screen :) . Even stepi, restarting (set $pc), list-ing, info reg and info spr work! Just the content of local variables may be not be displayed(?) correctly. The content has to be valid - otherwise the "Hello World" would not have reached my pc. Do you have an idea :) ? Thanks *alot* to all minsoc contributors! It is an amazing work of open source software! Slim, tidy and professional! Cheers Flo |
RE: minsoc: gdb set does not work
by flozn on Sep 10, 2013
Hey guys,
some news to my problem: According to the second thread of mine, I downloaded a fresh minsoc and made my Digilent Atlys changes. At the moment I think there is a problem in the Xilinx toolchain. Maybe some parameters for xst,par,map,... don't fit?! E.g. the parameter "-cm speed" for map is not available at Spartan 6 (at least for my config) - so I dropped it. The bitfiles of the fresh minsoc also have some strange behaviour. One bitfile reacts to gdb stepi with a step from 0x100 to 0x184. Another one runs fine until the first jump command and get lost with a invalid program counter. And another one doesn't match all constraints at build process :/ ... Maybe there is some timing problem? Do you have a tip for me? Thanks Alot! Cheers Flo ps. sorry for bad English ... (german) |
RE: minsoc: gdb set does not work
by rfajardo on Sep 11, 2013
rfajardo
Posts: 306 Joined: Jun 12, 2008 Last seen: Jan 6, 2020
Hi Flo,
thanks for the supporting words. Are you using release-1.0 or the development version? The development version works with the latest or1k and Advanced Debug System without the GDB patch. It is expected that stepping would not work under these circumstances.
As for the behaviour of your variables, I don't think that ever worked, but I am not completely sure. I would forward this question to the minsoc list; maybe you get an answer from Nathan, the developer of the Advanced Debug System that connects with GDB. Remember to take a look at the Known Bugs section of the Advanced Debug System project page: http://opencores.org/project,adv_debug_sys .
So, keep up the good work. Maybe you solve the problem and contribute the solution back to us :). Kind regards, Raul
RE: minsoc: gdb set does not work
by flozn on Sep 20, 2013
Hi Raul!
I've made some steps towards a running system ;) ! The UART and ETH examples now work on my board :) ! Two things seem to be the main reason:
- OR1200 dual-port RAM active (generic disabled) ... but then only ETH works (UART does not :/ )
- a timing constraint on my clk input pin for its 100 MHz frequency
The "mint" system now runs fine through simulation, synthesis and real tests (adv self-test and sw tests)! Building on this success, I wrote the modules needed for Wishbone to access my data processing chain. The Wishbone master model used by the UART block was very nice for verifying the register controller and the interrupts. After a day of reading through the system's scripts and folders, I integrated my Wishbone slave module into minsoc and wrote some software. The simulation of the whole system (or1200 + software + bus + own slave) runs fine! :) (A very nice experience to access/control your own hardware via software!) I noted down every single step for adding a custom slave to minsoc; after my thesis I'm looking forward to adding some HowTo material to the minsoc wiki.
Now my problem :/ ... : As described, the simulation runs fine. The errors/warnings my wb slave produced during synthesis are fixed, and synthesis now finishes with "all constraints met" (100 MHz clk). So far so good. The remaining warnings are mainly: no load, absorbed signals, not used. But the bitfile containing my wb slave passes the SRAM test and fails the OR1200 test :( . Here some output:

[root@toshiba minsoc]$ ./test_system.sh
Structs successfully initialized
Initializing USB device
Connected to libftdi driver.
Initializing USB device
Open MPSSE mode returned: OK
Enumerating JTAG chain...
Devices on JTAG chain:
Index Name     ID Code     IR Length
----------------------------------------------------------------
0:    XC6SLX45 0x34008093  6
Target device 0, JTAG ID = 0x34008093
Xilinx IDCODE, assuming internal BSCAN mode (using USER1 instead of DEBUG TAP command)
IDCODE sanity test passed, chain OK!
*** Doing self-test ***
Stall or1k - CPU(s) stalled.
SRAM test:
expected 11112222, read 11112222
expected 33334444, read 33334444
expected 55556666, read 55556666
expected 77778888, read 77778888
expected 9999aaaa, read 9999aaaa
expected bbbbcccc, read bbbbcccc
expected ddddeeee, read ddddeeee
expected ffff0000, read ffff0000
expected dedababa, read dedababa
SRAM test passed
Testing CPU0 (or1k) - writing instructions
Setting up CPU0
Starting CPU0!

(and then nothing happens :( ... )

Starting the normal adv bridge without the test flag -t leads to the following behaviour:

Devices on JTAG chain:
Index Name     ID Code     IR Length
----------------------------------------------------------------
0:    XC6SLX45 0x34008093  6
Target device 0, JTAG ID = 0x34008093
Xilinx IDCODE, assuming internal BSCAN mode (using USER1 instead of DEBUG TAP command)
IDCODE sanity test passed, chain OK!
Burst read timed out.
Retry count exceeded in burst read!
ERROR reading DCR 0 at startup! 'max retries'
Burst read timed out.
Retry count exceeded in burst read!
ERROR reading DCR 1 at startup! 'max retries'
Burst read timed out.
Retry count exceeded in burst read!
ERROR reading DCR 2 at startup! 'max retries'
Burst read timed out.
Retry count exceeded in burst read!
ERROR reading DCR 3 at startup! 'max retries'
Burst read timed out.
Retry count exceeded in burst read!
ERROR reading DCR 4 at startup! 'max retries'
Burst read timed out.
Retry count exceeded in burst read!
ERROR reading DCR 5 at startup! 'max retries'
Burst read timed out.
Retry count exceeded in burst read!
ERROR reading DCR 6 at startup! 'max retries'
Burst read timed out.
Retry count exceeded in burst read!
ERROR reading DCR 7 at startup! 'max retries'
No watchpoint hardware found, HWP server not starting
HWP server listening on host toshiba (0.0.0.0), port 9928, address family IPv4
JTAG bridge ready!
HWP server thread running!
Burst read timed out.
Retry count exceeded in burst read!
Burst read timed out.
Retry count exceeded in burst read!
Burst read timed out.
Retry count exceeded in burst read!
Burst read timed out.
Retry count exceeded in burst read!
Error while reading all registers: 'max retries'

The "Burst read timed out" and following messages appeared while I was trying to access the system with gdb (>target remote :9999). Do you have an idea? I have come so far; it would be really sad not to get the final system running ... Thanks a lot!! Cheers Flo
RE: minsoc: gdb set does not work
by flozn on Sep 21, 2013
Hey guys! I found some related information in two other threads. The user "gmessier" also tried to port minsoc to the Digilent Atlys board; in the end he may not have found a solution?!? The first thread ends with the promise to contribute once some problems are fixed:
1. minsoc 1.0 adv_jtag_bridge gdb CRC Error
2. Spartan6 with minisoc
The funny thing is, my customized minsoc runs fine through the Ethernet and UART sw examples and the adv self-test. (Some lines above I was not precise: the or1200 dual-port define is necessary when *both* ETHERNET *and* UART are defined in minsoc_defines.v. The corresponding bitfile behaves perfectly. I think (not 100% sure) that the or1200 dual-port define leads to an error if *only* the UART module is defined in minsoc_defines.v.)
At the moment I have collected the following possible causes for the strange adv bridge behaviour:
- in syn/: make distclean before make all (I always did that ... :/ )
- a reset error which causes the CPU (not?) to stall; I never took care of resetting the CPU (I just uploaded the bitfile and started the JTAG bridge)
- clock divisor set to 2; I use 4, which seems to be right: the or1200 core can only run at max. 42 MHz (xst log) due to combinatorially driven clocks -.- (100 MHz clkin / 4 = 25 MHz sysclock) ...
A general note: I use the minsoc 1.0 version (not the development one). It would be great to get some tips :) !!! Cheers Flo
P.S. Sorry for the bad English :/
RE: minsoc: gdb set does not work
by rfajardo on Sep 22, 2013
Hi Flo,
you are making good progress. Did your ported system work with adv_jtag_bridge? It is very strange that it would not work after you have included a custom module to the wishbone interconnect. Is your module a slave or a master? -If it is a master, it has to release the interconnect after some time. Otherwise, the Advanced Debug System will never be able to access the memory (through the interconnect) and will fail. You have to clear CYC and STB to release the interconnect at some point. -If it is a slave, you have to be sure that you acknowledge transactions, otherwise the transaction will lock too. I hope that gives you a starting point to look at. Best regards, Raul |
RE: minsoc: gdb set does not work
by flozn on Sep 23, 2013
Hey Raul!
:) I'VE MANAGED IT !!!! (:
First I checked your proposal: my wb slave generates the ACK signal correctly in simulation. Then I searched a bit for OpenRISC in connection with the Digilent Atlys. The orpsocv2 project does support the Atlys board (/openrisc/trunk/orpsocv2/boards/xilinx/atlys/)! That was my motivation to keep trying to get minsoc running; after all, OpenRISC is "compatible" with the Spartan-6 mounted on the Atlys board.
The central file is or1200_defines.v. At first I tried to use the one from orpsoc, but it is not compatible with minsoc 1.0 (it uses Wishbone v3). My working minsoc, with all self-tests (RAM, CPU), my own WB slave, working adv/gdb (stepi, set $pc ok) and no error/warning messages during operation, uses the following or1200_defines.v configuration:

`define OR1200_XILINX_RAMB16
`define OR1200_RFRAM_DUALPORT
//`define OR1200_RFRAM_GENERIC
//`define OR1200_NO_DC
//`define OR1200_NO_IC
//`define OR1200_NO_DMMU
//`define OR1200_NO_IMMU
`define OR1200_IC_1W_4KB
`define OR1200_DC_1W_4KB

Important: do not use the 8 KB instruction and data caches! Only the 4 KB size works on my minsoc. I am very happy the system is now friendly to me ;) after I struggled with it for such a long time :P ... My data chain behaves ideally, just like in the simulation! Now I can concentrate on further steps in the control software and HDL details. Thanks a lot to you, Raul, and all other contributors of minsoc! Cheers Flo
RE: minsoc: gdb set does not work
by rfajardo on Sep 27, 2013
Hello Flo,
I'm glad it worked. Congratulations! I remember that the OpenRISC Release 1 implementation used in minsoc-1.0 has problems with the cache. Do you really need a cache? I believe the standard configuration does not have it enabled. Didn't the system work without cache?
Are you still working on your Eclipse plug-in for OpenRISC? That would be a valuable contribution too :). We would be very happy if you find the time to contribute a board port or update the wiki. Remember to use the trunk repository to create a board port. But beware that the Advanced Debug System has problems with the trunk repository, since it uses the latest OpenRISC. Best regards, Raul
RE: minsoc: gdb set does not work
by flozn on Sep 29, 2013
Hi Raul!
"Do you really need a cache? ... Didn't the system work without cache?" That is the essential point: only *with* cache did the self-tests finish successfully ...
"I believe the standard configuration does not have it in." Yes! Standard minsoc v1.0 is without caches for FPGA synthesis.
"Are you still working on your Eclipse plug-in for OpenRISC? That would be a valuable contribution too :)." I think you may be confusing something ;) . That must be another "Flo" who is working on an Eclipse plug-in; I do everything with text editors and command line tools ;).
I'm going to add the board and the HowTo material to the wiki right after finishing my thesis! Last week I implemented the I2C slave (i2c_master_top.v) and found another pitfall: the asynchronous reset is important and has to be driven! (The synchronous one may be left unconnected.) Otherwise some nice bus errors and self-test errors occur ... In the last days I also ported some C++ software of mine to the minsoc example software. The necessary steps for C++ may make an interesting HowTo page as well. Cheers Flo
RE: minsoc: gdb set does not work
by rfajardo on Sep 30, 2013
Yes, I was mixing things up. I'm sorry about that.
I'll keep my fingers crossed for you.
RE: minsoc: gdb set does not work
by flozn on Oct 4, 2013
Hi Raul!
I am a few steps further, but not yet at the finish ... My (already proven) C++ code runs on my minsoc and uses the object files (e.g. uart.o and i2c.o) of minsoc. A strange SIGBUS error occurred in two cases (the first is no real problem):
1. the C++ code compiled with any optimization other than -O0
2. calling i2c_trans from the *same* object file that runs fine in the C example
The specific symptom is NPC=0x0, e.g. at i2c.cpp: "dat_list[ i ] = data->data" (l.lhz r3,0x2(r3)). For compiling I used the example linker script of or32-elf-g++ and added the stack section (for the minsoc reset code). I do not use newlib (I use the minsoc reset, interrupt and exception handling instead) to save RAM space (only the 32 kB SRAM synthesizes correctly).
My fear is that the system may not be synthesized correctly :/ . I enabled the I and D caches in hardware and get successful self-test results, but maybe there is a cache problem because of this. For testing the caches I compiled the enable functions from newlib/libs/or32/or1k-support-asm into minsoc. As you said, the caches do not work with minsoc version 1.0. However, disabling them, e.g. via or1k_icache_disable(), makes no change in the faulty behaviour :( . At the moment I am trying to get orpsocv2 running. Nevertheless it would be nice to solve the minsoc problem; I have spent a lot of time on it ... Cheers Flo
RE: minsoc: gdb set does not work
by rfajardo on Oct 6, 2013
Hi Flo,
minsoc-dev should have working caches, but the debugger does not work as nicely there. I think the C++ support of the compiler used by minsoc is not complete; you should grab yourself a current compiler for that. Even then, I'm not sure how far the C++ support currently goes. Let's see if I can think of something else. Regards, Raul
RE: minsoc: gdb set does not work
by flozn on Oct 6, 2013
Hi Raul!
Thanks for the reply! I use a self-built toolchain (or32-elf-g++) as described in http://opencores.org/or1k/OpenRISC_GNU_tool_chain . The toolchain of minsoc version 1.0 does not even support C++ (if I'm not missing anything). In the next days I will try the development version and hopefully get some good results.
The C++ code is a communication framework which allows executing functions and getting their results back to the host. On AVR I built an update system with this framework, and I don't think the C++ support on OpenRISC is any less complete than on AVR. But who knows ...
"Let's see if I can think of something else." I would be very grateful for any idea! Regards, Flo
RE: minsoc: gdb set does not work
by flozn on Oct 15, 2013
Hey Raul!
I have some minsoc news! Yesterday I adapted the dev version of minsoc to my Atlys board. Now 64 kB of RAM synthesize without problems, and even stepping works! You changed some things ;) (e.g. the interconnect definitions in hw/sw, the UART base address via parameter, ...), but now the simple tests work.
The bad news is that my old SIGBUS problem occurred again :/ ! The i2c_trans function crashes with the parameters (mode, type) passed from C++. This single SIGBUS error was solved by creating an extra function in C which only takes by-value parameters (slave addr, reg addr, data; all uchar). Now both C and C++ code can start I2C actions without problems. But after expanding the C++ code, the SIGBUS error now occurs in the C++ code itself :( ... I think there is some problem with the stack. Is it not enough to set general purpose register 1 (the stack pointer) once for the whole program (consisting of C *and* C++ code)? Content of minsoc/sw/support/reset.S:

l.movhi r1,hi(_stack)
l.ori r1,r1,lo(_stack)

Maybe you have an idea :) . Cheers! Flo
RE: minsoc: gdb set does not work
by rfajardo on Oct 15, 2013
Watch out for a stack overflow. Output the stack register when the error occurs and, if required, increase the stack size. I believe it is defined in board.h.