{23} Trac comments (3729 matches)

Results (2201 - 2300 of 3729)

Ticket  Posix time (µs)  Author  Comment
#1020 1300196215000000 thejimmyg This is now on DGU live.
#1006 1300105927000000 rgrp Not sure when I described that other process, but I'm definitely of the opinion that we should:
 * Deprecate and remove the stable branch
 * Deprecate and remove the metastable branch (going forward we can use release branches for what we did with metastable in the past)
We may want to update BranchingPolicy to reflect this (it should probably also have some statement about closing release branches and tagging). Reassigning to David Raznick as he is our release guru now.
#927 1300105638000000 rgrp Closing this ticket as #841 is minor. More work on docs can go in new tickets.
#1030 1300103984000000 amercader This includes:
 * Take out all references to harvestsource, harvestingjob and harvesteddocument in the REST API
 * Move the harvesting bits of ckan/lib/cli.py into ckanext-harvest
 * Move ckan/controller/harvesting.py and ckan/model/harvesting.py to ckanext-harvest as well
 * Update ckanext-csw to be able to find the code it needs in the new place.
#1003 1300100411000000 rgrp Converted completely to Backbone and now have fully operational 'add dataset' functionality. I'm closing this ticket now - further improvements can go in their own tickets.
#1005 1300100085000000 dread Moved to dgu trac ticket 869
#1006 1300098217000000 dread Reassigning to rgrp for response
#810 1300093797000000 rgrp Moving back to backlog for v1.4 as should be dependent on forms overhaul and seems to be problematic (and not that urgent).
#867 1299866685000000 rgrp This was a breaking change for loaders code. Obviously we don't have tests for that, so it would not have been noticed ... Fixed in cset:af81e54bd590/ckanclient
#1031 1299863222000000 rgrp Done, see cset:78d96b520679
#366 1299845781000000 pudo You're right, that's done!
#366 1299845116000000 dread I'm very pleased that this now works when you try to edit a package that is not allowed. Are there other circumstances we should cover or can we close this?
#913 1299841413000000 rgrp If at all possible this should use URIs from the OKF licenses registry at http://licenses.opendefinition.org/
#927 1299841206000000 rgrp Major update including notes of what has been done (where not a separate ticket) and addition of a few items.
#929 1299840884000000 rgrp I'm closing this ticket as a) most systems should install the licenses package (and hence have the licenses locally) and b) the licenses service has now moved to S3 so should be very robust (see ticket:973 and http://knowledgeforge.net/okfn/tasks/ticket/605)
#904 1299840539000000 rgrp We're already now into improving the docs and ticket:927 is now reasonably detailed.
#1015 1299788821000000 rgrp kindly applied the SQL fixes today and I can confirm this is now fixed. Well done kindly for all the great work here.
#738 1299761436000000 thejimmyg This is now complete and on UAT.
#1028 1299760360000000 dread Fixed in cset 4d860a53fbad on 1.3.2 and merged to default.
#662 1299755259000000 dread We want this fixed for the CLG customer (DGU), so we have put a quick fix into branch 3.1.2 cset:0010a709edf0 (and merged it to default) as a stopgap whilst new forms are on their way.
#1025 1299751045000000 dread Thanks for pointing this out. Changing the definition of 'editor' to not allow edit is (as you admit) a bit hacky, and I think prone to confusion. James and I weren't aware there was a precedent for doing it this way, but if we had been, we may well have followed it. I mentioned the branch for this, in preference to the individual csets.
#1025 1299750097000000 rgrp (Link to changeset). Could you briefly clarify why this was needed in config -- we already have a process for putting things into a more restricted mode (see ticket:833) and have been working on creating a WUI to be able to do this automatically (see that ticket).
#1027 1299682082000000 pudo fixed in cset:532c3ad2743b
#1024 1299668648000000 pudo re-created this (#1027)
#1017 1299668555000000 pudo fixed in cset:86d49a775fd3
#1027 1299666326000000 pudo
 1. home controller -> __before__ (check "site-read" on model.System)
 2. user -> each individually (repoze-who pseudo action must not be blocked)
    * user-read (index/read/update pages for users)
    * user-create (register)
 3. revision -> __before__ (check "site-read" on model.System)
 4. tag -> site-read (__before__)
functional/test_authz.py:
 * denies site-read ...
 * checks for visitor / logged in user ..
 * checks you can still visit login
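For illustration, a minimal sketch of the kind of __before__ check described above, assuming a Pylons-style CKAN controller; check_site_read() is a hypothetical stand-in, not the actual authorization call:
{{{
# sketch only: deny access when the visitor lacks "site-read";
# check_site_read() is a hypothetical helper for the real authorization query
from pylons import tmpl_context as c
from pylons.controllers.util import abort

from ckan.lib.base import BaseController
import ckan.model as model

def check_site_read(username, domain_object):
    '''Hypothetical helper: ask the authorization system whether the user
    may read the given domain object (here, model.System()).'''
    raise NotImplementedError

class HomeController(BaseController):
    def __before__(self, action, **env):
        super(HomeController, self).__before__(action, **env)
        if not check_site_read(c.user, model.System()):
            abort(401, 'Not authorized to see this page')
}}}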
#1026 1299605128000000 dread Done in cset:2dde3bd563fd for branch 1.3.2
#1025 1299598708000000 dread Done in enh-1025-config-default-authz and goes into release 1.3.2.
#1021 1299518828000000 pudo fixed in cset:9f1b38add19f
#1023 1299514847000000 pudo Tried implementing this with AMQP's msg.requeue() and channel.basic_recover(), but RabbitMQ yields a NOT_IMPLEMENTED error. A bit clueless about how to proceed.
#1022 1299512991000000 pudo We're now using fileConfig to configure the logger API from the worker config file and this enables us to use SMTPHandler to send out error messages on queue processing failures. Marking as fixed.
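As an illustration, a minimal sketch of what an SMTPHandler entry in the worker's logging config achieves, written here as the programmatic equivalent; the addresses, subject and logger name are made up:
{{{
# sketch only: email ERROR records raised during queue processing
import logging
import logging.handlers

smtp_handler = logging.handlers.SMTPHandler(
    mailhost=('localhost', 25),
    fromaddr='worker@example.org',
    toaddrs=['sysadmin@example.org'],
    subject='CKAN queue worker error')
smtp_handler.setLevel(logging.ERROR)

logging.getLogger().addHandler(smtp_handler)
# an ERROR logged by the worker would now be sent out by email
logging.getLogger('ckanext.queue').error('failed to process message')
}}}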
#956 1299489084000000 kindly cset:1305b9192d49
#1014 1299245293000000 sebbacon Run out of time for decoupling, but tests and README.txt written (including pointers about how to customise for anyone who needs to decouple in the future)
#1011 1299245206000000 sebbacon Merged to default https://bitbucket.org/okfn/ckan/changeset/e8217c317a8e
#1013 1299245157000000 sebbacon This is now resolved, but depends on core CKAN behaviour (specifically pluggable middleware and unicode-aware error pages) to function: https://bitbucket.org/okfn/ckan/changeset/c846794c1799
#971 1299245064000000 sebbacon folded into #1013
#1019 1299166930000000 pudo fixed in https://bitbucket.org/okfn/ckanext-webhooks/changeset/034647931921
#496 1299164106000000 thejimmyg Will has implemented this now and OS have confirmed their export to GeoNetwork works.
#427 1299164063000000 thejimmyg This is done in the latest release to test.
#1018 1299073340000000 dread Done in cset:e4167f8b3f80 on default
#663 1298913603000000 kindly cset:76a77439ecd0
#994 1298912830000000 kindly see cset:93188d42fc12
#1000 1298912726000000 kindly fixed cset:630513f550d5
#1015 1298902753000000 kindly The migration fixes should sort this out, but I will keep the ticket open to check.
#937 1298892547000000 sebbacon The current implementation I referenced above will be a good starting point. Work that remains:
 * Add download click tracking to individual download links (currently we just record page views for packages, not downloads)
 * Somehow cache the download stats against each package (the Google API is very slow); package Redis or SQLite or similar as a local storage for the extension
 * Expose download information in the relevant places in the UI (all users? package owners? where?)
This is about 2 days' work. Unlikely to get it done in this sprint.
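A minimal sketch of the local caching idea using SQLite; the table layout and the assumption that counts arrive as a {package_id: count} dict are for illustration only:
{{{
# sketch only: keep GA-derived download counts in a local SQLite file so the
# slow Google Analytics API is not queried on every page view
import sqlite3

def cache_download_counts(counts, db_path='ga_stats.db'):
    '''counts is a dict mapping package id to download count.'''
    conn = sqlite3.connect(db_path)
    conn.execute('CREATE TABLE IF NOT EXISTS downloads '
                 '(package_id TEXT PRIMARY KEY, count INTEGER)')
    conn.executemany('INSERT OR REPLACE INTO downloads VALUES (?, ?)',
                     counts.items())
    conn.commit()
    conn.close()

def get_download_count(package_id, db_path='ga_stats.db'):
    conn = sqlite3.connect(db_path)
    row = conn.execute('SELECT count FROM downloads WHERE package_id = ?',
                       (package_id,)).fetchone()
    conn.close()
    return row[0] if row else 0
}}}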
#1003 1298889293000000 rgrp Have now started refactor to use backbone and have basic inline editing working and started on Add dataset view.
#833 1298889104000000 rgrp In progress now (sysadmin view and update nearly done).
#962 1298889078000000 rgrp Nearly done.
#982 1298887980000000 dread Buildbot scripts now fixed.
#941 1298886391000000 wwitzel3 Continued work on the community plugin. I am still learning the layout of templates and how they work within CKAN, and figuring out Genshi templates, so this is where most of the delay has been. I've been able to determine a pretty good plugin layout for extensions that create models. I am currently focusing on getting the rest of the UI in place and trying to determine the best way to get Colander to do the desired validation beyond ensuring the form has all the elements. After today's work, I will push what I've done, and I would like to walk through the design with someone at some point.
#363 1298840718000000 kindly Revision objects are created every time a new revision is made, even if there are no changes.
#1011 1298825600000000 sebbacon Proposed implementation at https://bitbucket.org/okfn/ckan/changeset/187e65afb35f
#1011 1298824649000000 sebbacon The "external source" is an OAuth service. We need to look up user groups from that service.
#1008 1298821826000000 rgrp I've removed the eval in cset:1b8fedeb7ab0 - the more general question about caching should go in a separate ticket.
#1011 1298821699000000 rgrp I agree that IAuthorizer is useful but not sure how it addresses the requirement of the ticket. AuthorizationGroups are already editable via the web interface at /authorizationgroup
#1011 1298820235000000 sebbacon On reflection, may as well make a Plugin interface called IAuthorizer, which allows customisation of get_authorization_groups, get_roles, and is_authorized....
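A minimal sketch of the shape such an IAuthorizer plugin interface might take; the base Interface class location and the method signatures are assumptions here, not the final CKAN API:
{{{
# sketch only: the proposed extension point for customising authorization
from ckan.plugins.interfaces import Interface

class IAuthorizer(Interface):
    def get_authorization_groups(self, username):
        '''Return the authorization groups the named user belongs to.'''

    def get_roles(self, username, domain_obj):
        '''Return the roles the named user has on the given domain object.'''

    def is_authorized(self, username, action, domain_obj):
        '''Return True if the user may perform the action on the object.'''
}}}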
#1010 1298740889000000 rgrp Meant this cset:d2651db566ef
#1010 1298733856000000 rgrp Complete see branch feature-1010-list-users and closing changeset cset:feature-1010-list-users.
#1009 1298638447000000 pudo Some more ideas:
 * /user should list users, sorted by number of packages contributed/edited
 * /user/{name}/packages shows a list of packages to which the user has contributed
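A minimal Routes sketch of how those URLs might be wired up; the controller and action names are illustrative assumptions:
{{{
# sketch only: mapping the proposed user URLs to controller actions
from routes import Mapper

map = Mapper()
map.connect('/user', controller='user', action='index')  # list users
map.connect('/user/{name}/packages', controller='user',
            action='packages')  # packages the named user has contributed to
}}}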
#1006 1298631145000000 dread This is slightly different to your branch policy as of two weeks ago:
{{{
stable: stable code
metastable: (will soon be deprecated) for code preparing to be stable
default: development HEAD
}}}
which I prefer. My ideal would be to get rid of the confusing name 'metastable' and the unneeded 'stable', and start a new branch called 'released', which will act the same as 'master' in this diagram but with a more intuitive name: http://nvie.com/posts/a-successful-git-branching-model Then for each CKAN instance we can either use the most recent release (from 'released') or choose a specific one (e.g. 'ckan-1.3', or even 'default' or 'enh-865' for getting the latest features). This gives a good degree of flexibility, is more understandable to newbies, and is probably a more widely understood branching model.
#877 1298624165000000 rgrp Various tidying in https://bitbucket.org/okfn/ckanext-upload/changeset/0fad7aa7aa97 (success messages, permissions on uploaded file - public-read) and completed permissions in https://bitbucket.org/okfn/ckanext-upload/changeset/a83ce00a1266. Still need to integrate into general workflow (e.g. create a Resource on successful upload) but that is a separate item so this ticket is now done.
#944 1298571917000000 pudo Won't work on this for now - IATI is now running against plain CKAN but this is not deployed. We will shelve it for now and continue work on this once IATI requests more functionality.
#985 1298571248000000 pudo digitaliser.dk has been assigned to Stefan Marsirske to get him into this framework; everything else is delayed.
#926 1298541597000000 anonymous Goals: We want the interface for updating an object to be loosely coupled to the method for updating it. We might update a Package from:
 - HTML forms
 - a REST API (using JSON)
 - a CLI (potentially using command line arguments, YAML, XML or ini files)
Right now, data is validated using a form framework, even if we're not using forms. Data is written to the object as part of the forms framework (using the "sync()" method), making the process hard to customise and hard to discover. Instead, there should be a standard chain for:
 - deserialising untyped data (such as that received from an HTTP POST or parsed from a YAML file) into valid data
 - returning structured errors suitable for displaying to the user
 - saving the validated, deserialised data
Ideally, it would look something like:
{{{
schema = MySchemaDefinition()
raw_data = open("raw.csv", "r").read()
structured_data = to_python(raw_data, schema)
try:
    validated = validate(structured_data)
    myobject.update_from_dict(validated)
    return "Updated OK"
except ValidationError, e:
    return "Error: %s" % e.to_dict()
}}}
The inverse would be something like:
{{{
structured_data = myobject.render_to_dict()
raw_data.write(to_csv(structured_data, schema))
print "Wrote CSV %s" % to_logformat(structured_data, schema)
}}}
The question of how to generate and display forms should be completely decoupled from this. It should be easy to write forms by hand, which means it should be simple to flatten the serialized data to key, value pairs and match up any validation errors to each key. Optionally, a form widget generation framework is a nice-to-have, but not essential, as it is expected that, given enough time, the majority of forms will require manual coding to accommodate edge conditions. A form widget generation framework should be reasonably complete if it's worth trying at all, which means it should support things like:
 - nested fields (at least repeating, multi-value fieldsets)
 - widgets for dates and file uploads
 - internationalisation
...but note I'd settle for *no* widget generation.
Components of a serialisation / validation framework:
 - a simple, obvious way to define a schema
 - a lightweight validation implementation
 - a simple interface for validators
 - easy to match validation errors to data structure items
Overall, I'd like to see:
 - loose coupling, no framework dependencies
 - maximal test coverage
 - extensive documentation with readily available examples
## Findings
I looked at flatland, formencode, FormAlchemy, formish, WTForms, Django, web2py, deform/colander, formconvert and web.py.
 - **web2py** just helps build HTML from Python, so isn't what I'm after at all
 - **web.py** has rudimentary validation which is only aimed at HTML forms and is hence tightly coupled with them
 - **Django**'s forms are again tightly coupled to HTML forms (and their generation)
 - **FormAlchemy** similarly couples validation to forms, and is focussed on inferring a schema from an SQLAlchemy data model
 - **WTForms** again focuses on form generation and doesn't make it easy to deserialise arbitrary data
This leaves us with Flatland, Formencode, Formish, Colander/Peppercorn/Deform, and FormConvert. Having reviewed all of these, I rejected Formencode on the basis of its patchy documentation and relatively low unit test coverage. I also found it mixed concerns a bit much for my taste. Formish felt similarly sparsely documented.
Of the remainder, I'd be happy using any of them, but opted for Colander in the end as it has the most exhaustive documentation and unit tests and has been used in production for a long time. FormConvert has a nice design but is a bit of a moving target at the moment -- worth revisiting in the future.
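As a concrete illustration of the chosen approach, a minimal Colander sketch; the DatasetSchema fields are invented for the example and only colander itself is assumed:
{{{
# sketch only: define a schema, deserialise untyped data, collect errors
import colander

class DatasetSchema(colander.MappingSchema):
    # illustrative fields only
    name = colander.SchemaNode(colander.String())
    title = colander.SchemaNode(colander.String(), missing=u'')

schema = DatasetSchema()
try:
    # e.g. data decoded from an HTTP POST or a YAML file
    validated = schema.deserialize({'name': u'my-dataset', 'title': u'My Dataset'})
except colander.Invalid, e:
    # structured errors keyed by field, suitable for showing to the user
    print e.asdict()
}}}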
#1003 1298490126000000 rgrp Work so far in http://bitbucket.org/rgrp/ckanjs
#926 1298489517000000 rgrp @Seb: I believe this is now decided following discussion last week. Please could you detail results and close :)
#821 1298486642000000 dread Investigating several of these packages, it works for me (and David Raznick). For example ni_013_migrants_english_language_skills_and_knowledge: one resource is seen created in the diffs, and is displayed in CKAN, in the API and in the dumps. Yet looking at the dump from 17/11/10 when this ticket was created, the resource didn't have a URI, which the current model requires. This suggests the data was fine underneath but there were problems displaying this field, and that is now fixed.
#982 1298482394000000 dread Need to do this for older branches, which aren't subject to #963.
#659 1298424109000000 nils.toedtmann Good idea. Listed this in my nagios ticket http://knowledgeforge.net/okfn/tasks/ticket/600
#659 1298379892000000 dread Smoketest scripts exist for exactly this in ckanext. It would be great to have this running on nagios. It is as simple as running:
{{{
python blackbox/smoke.py -H ckan.net blackbox/ckan.net.profile.json
}}}
See here for code: https://bitbucket.org/okfn/ckanext/src/default/blackbox
#931 1298379187000000 dread This was completed in ckanclient in cset:1bfefd7596d3
#805 1298379084000000 dread Migration tests added to buildbot using kindly's new nose option #965. Also removed legacy system of migration testing in: ckan/migration/tests and updated docs. cset:643673c7db3e
#993 1298373114000000 dread Fixed on 1.3 cset:7708c8b521ed and merged to default. Deployed to ckan.net.
#998 1298372171000000 dread Yes I agree - either of those sounds good. I think I've always used 'db init' in preference anyway.
#998 1298371191000000 anonymous I am happy to get rid of paster db create altogether as a compromise? Or add a deprecation warning to it?
#998 1298369862000000 dread 'paster db create' (or init) should do exactly what we ask. Surely we should simply tell people to use 'paster db upgrade' instead?
#505 1298368280000000 dread Now complete
#893 1298293527000000 thejimmyg We don't understand the use case for this requirement. Closing for now until a use case can be demonstrated.
#963 1298284252000000 thejimmyg You can now get CKAN from the repository http://apt-alpha.ckan.org/debian
#482 1298284158000000 thejimmyg This is now 6 months old and there still doesn't seem to be a requirement for this. Marking wontfix and we can come back to it if it comes up again.
#435 1298284084000000 thejimmyg Haven't seen this myself and it is 6 months old now.
#936 1298283172000000 thejimmyg Hi Wayne, I'm assigning this to you but it isn't a priority yet. We'll put it in a sprint when it is time to do it. Cheers, James
#430 1298283075000000 thejimmyg We are doing other refactoring that is more important than this, such as: * Plugin APIs to enable extensions * Form refactoring. This ticket is 6 months old so closing.
#992 1298060474000000 rgrp Fixed in cset:08548ef8f0e9
#991 1298037717000000 dread Fixed in cset:56cccbbb9d1a in time for ckan 1.3 release. This did not affect previous releases.
#963 1297850773000000 thejimmyg We will also remove all the different pip files as part of this, fixing #982 at the same time.
#982 1297850732000000 anonymous This is now rolled into #963. Marking as duplicate. People can get the pip file from a branch over HTTP like this: https://bitbucket.org/okfn/ckan/src/<branch-name>/path/to/file/you/want
#986 1297812401000000 wwitzel3 https://bitbucket.org/okfn/ckanext-qa/src/be57e20c60ef/
#715 1297796784000000 pudo fixed in cset:69c4210f635a
#808 1297783658000000 pudo implemented in cset:8200247e74e9
#983 1297773407000000 dread Error was tracked down to cset:214a8f9fc1c2 (26-9-2010): upgrade_db called validate_authorization_setup(), which calls setup_default_user_roles(System()). Fixed in cset:9f51a1c8ac83 for the 1.3 branch and merged to default.
#989 1297706620000000 kindly I do not think we need to 'extend the model' if you intend to make the migrations separate. If the schema is decoupled, then there are no problems. So each plugin can have its own model and use SQLAlchemy independently, i.e. have its own metadata, classes and mappers. They do not even have to use SQLAlchemy. What I mean is that there is no need to do anything apart from:
 * Agree on a naming convention for the plugin tables (including their own migrate table each)
 * Agree to the rule that no plugin can add a column to an existing table
 * Agree that no table can have a (database level) foreign key constraint between the core tables and itself, in either direction. They *can* have implied sqlalchemy-level joins.
 * Maybe have a hook so that on db upgrade all plugins are upgraded
Each plugin will have to redefine the tables, classes and mappers they need to join onto the core tables themselves; reusing/extending the core model will not be worth the trouble. This seems to cover your use cases, and this way everything is nicely decoupled. Best of all there is very little work to do...
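A minimal SQLAlchemy sketch of that convention; the 'myplugin_' prefix and the column names are invented for illustration:
{{{
# sketch only: a plugin keeps its own MetaData, table, class and mapper,
# prefixes its table name, and avoids database-level FKs to core tables
from sqlalchemy import MetaData, Table, Column, types, orm

plugin_metadata = MetaData()   # separate from CKAN's core metadata

application_table = Table('myplugin_application', plugin_metadata,
    Column('id', types.UnicodeText, primary_key=True),
    Column('name', types.UnicodeText, nullable=False),
    # the join to the core package table is implied at the SQLAlchemy level,
    # not enforced with a ForeignKey constraint
    Column('package_id', types.UnicodeText),
)

class Application(object):
    pass

orm.mapper(Application, application_table)
}}}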
#989 1297700818000000 pudo Kindly, I agree - it would be much preferable to have independent storage for plugins, and this would be easy to do if we were using another type of storage already. As it stands, however, our storage mechanism is SQL. I think we should use it for what it is as much as possible and do the weird, vertical stuff (k,v tables, swapping to Redis) only if we really need it. For everything else: let's use SQL as it was intended. Examples:
 * We want to develop an apps catalogue as a CKAN plugin. While we could certainly put this in Redis, there is no reason why we can't have the following table: application (id, name, title, description, author, project_url, site_url, code_url, image).
 * A watchlist plugin could essentially work on UUIDs alone. What you'd end up with is something like this: watch (id, user, scope_id).
Re migrations you're right, but my first intention would be to handle that separately for each plugin (i.e. they need to have their own migration repositories that they keep track of, e.g. via an apps_migrate_version table).
#989 1297700363000000 kindly It would be nice to know some use cases. I think that plugins should control their own storage, or share a storage that is designed to be flexible (mongo, redis ...). We do not seem to be able to keep our current migrate repository in sync let alone add plugins to the mix.
#937 1297689859000000 sebbacon (and it would also need some proper caching as the GA API is very slow)
#937 1297689781000000 sebbacon I did a very quick hacky thing at the end of last week on top of the "insert google analytics code" extension we discussed, to work out "most popular packages" based off data harvested from the Google Analytics API. Needs making generic, tests etc but could be a starting point: https://bitbucket.org/sebbacon/ckanext-googleanalytics/src
#941 1297689750000000 thejimmyg The system will need some way of plugging in the model. See ticket #989 for progress on this. Other ideas:
 * The apps will need an image upload
 * We might like a voting system for apps and ideas, potentially one that could be re-used later
Let's discuss the above ideas after the basic functionality is in place.
#801 1297686706000000 thejimmyg We do have a requirement for this now. The job model has changed so that it is hidden from the user. We therefore want to know the timestamp the job started and the timestamp it finished. We'll therefore need migrations added too.
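For illustration, a minimal sqlalchemy-migrate sketch of the kind of migration mentioned; the table name ('harvesting_job') and column names ('created', 'finished') are assumptions here:
{{{
# sketch only: add start/finish timestamp columns to the (assumed) job table
from sqlalchemy import MetaData, Table, Column, DateTime
from migrate import *   # provides Column.create() via migrate.changeset

def upgrade(migrate_engine):
    meta = MetaData(bind=migrate_engine)
    job = Table('harvesting_job', meta, autoload=True)
    Column('created', DateTime).create(job)
    Column('finished', DateTime).create(job)
}}}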
#794 1297686491000000 thejimmyg Actually, for the time being we will match but not do anything with that matched information until there is a clear use case. Publisher is simply the publisher for which the source was registered. Closing this ticket.
#427 1297686183000000 thejimmyg Documentation of the licenses service was handled in #973. Changing this ticket to be about matching the license service in UKLII.